Dec 16 06:51:00 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 16 06:51:00 crc restorecon[4762]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 06:51:00 crc restorecon[4762]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 16 06:51:00 crc restorecon[4762]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc 
restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:00 crc restorecon[4762]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 
06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 
crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 
06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 06:51:01 crc restorecon[4762]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 
crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 06:51:01 crc restorecon[4762]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc 
restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 06:51:01 crc restorecon[4762]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 16 06:51:01 crc kubenswrapper[4789]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 06:51:01 crc kubenswrapper[4789]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 16 06:51:01 crc kubenswrapper[4789]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 06:51:01 crc kubenswrapper[4789]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 16 06:51:01 crc kubenswrapper[4789]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 16 06:51:01 crc kubenswrapper[4789]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.888287 4789 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891405 4789 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891415 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891420 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891425 4789 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891429 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891433 4789 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891437 4789 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891443 4789 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891448 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891452 4789 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891457 4789 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891462 4789 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891468 4789 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891474 4789 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891479 4789 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891485 4789 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891491 4789 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891496 4789 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891502 4789 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891507 4789 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891517 4789 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891522 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891526 4789 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891530 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891535 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891538 4789 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891542 4789 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891546 4789 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891549 4789 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891553 4789 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891556 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891560 4789 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891564 4789 feature_gate.go:330] unrecognized feature gate: Example
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891568 4789 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891572 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891576 4789 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891580 4789 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891583 4789 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891587 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891593 4789 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891598 4789 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891602 4789 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891605 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891610 4789 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891614 4789 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891619 4789 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891624 4789 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891628 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891632 4789 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891636 4789 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891642 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891646 4789 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891649 4789 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891653 4789 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891657 4789 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891660 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891668 4789 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891856 4789 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891860 4789 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891864 4789 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891867 4789 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891871 4789 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891875 4789 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891878 4789 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891881 4789 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891885 4789 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891888 4789 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891891 4789 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891895 4789 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891898 4789 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.891902 4789 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892186 4789 flags.go:64] FLAG: --address="0.0.0.0"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892197 4789 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892203 4789 flags.go:64] FLAG: --anonymous-auth="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892208 4789 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892214 4789 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892218 4789 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892224 4789 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892229 4789 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892233 4789 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892237 4789 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892242 4789 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892248 4789 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892252 4789 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892256 4789 flags.go:64] FLAG: --cgroup-root=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892260 4789 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892264 4789 flags.go:64] FLAG: --client-ca-file=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892268 4789 flags.go:64] FLAG: --cloud-config=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892273 4789 flags.go:64] FLAG: --cloud-provider=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892277 4789 flags.go:64] FLAG: --cluster-dns="[]"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892283 4789 flags.go:64] FLAG: --cluster-domain=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892287 4789 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892292 4789 flags.go:64] FLAG: --config-dir=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892296 4789 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892301 4789 flags.go:64] FLAG: --container-log-max-files="5"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892307 4789 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892311 4789 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892315 4789 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892319 4789 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892323 4789 flags.go:64] FLAG: --contention-profiling="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892327 4789 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892331 4789 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892336 4789 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892340 4789 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892345 4789 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892349 4789 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892353 4789 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892357 4789 flags.go:64] FLAG: --enable-load-reader="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892361 4789 flags.go:64] FLAG: --enable-server="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892365 4789 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892369 4789 flags.go:64] FLAG: --event-burst="100"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892374 4789 flags.go:64] FLAG: --event-qps="50"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892378 4789 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892382 4789 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892387 4789 flags.go:64] FLAG: --eviction-hard=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892392 4789 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892396 4789 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892400 4789 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892404 4789 flags.go:64] FLAG: --eviction-soft=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892408 4789 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892412 4789 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892416 4789 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892420 4789 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892424 4789 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892428 4789 flags.go:64] FLAG: --fail-swap-on="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892432 4789 flags.go:64] FLAG: --feature-gates=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892437 4789 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892441 4789 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892446 4789 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892450 4789 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892454 4789 flags.go:64] FLAG: --healthz-port="10248"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892458 4789 flags.go:64] FLAG: --help="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892462 4789 flags.go:64] FLAG: --hostname-override=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892466 4789 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892470 4789 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892474 4789 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892478 4789 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892482 4789 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892486 4789 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892489 4789 flags.go:64] FLAG: --image-service-endpoint=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892493 4789 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892497 4789 flags.go:64] FLAG: --kube-api-burst="100"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892501 4789 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892506 4789 flags.go:64] FLAG: --kube-api-qps="50"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892510 4789 flags.go:64] FLAG: --kube-reserved=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892514 4789 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892518 4789 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892522 4789 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892526 4789 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892530 4789 flags.go:64] FLAG: --lock-file=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892534 4789 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892538 4789 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892542 4789 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892547 4789 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892553 4789 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892557 4789 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892561 4789 flags.go:64] FLAG: --logging-format="text"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892583 4789 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892588 4789 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892594 4789 flags.go:64] FLAG: --manifest-url=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892599 4789 flags.go:64] FLAG: --manifest-url-header=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892606 4789 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892611 4789 flags.go:64] FLAG: --max-open-files="1000000"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892617 4789 flags.go:64] FLAG: --max-pods="110"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892625 4789 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892629 4789 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892634 4789 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892638 4789 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892642 4789 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892646 4789 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892650 4789 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892663 4789 flags.go:64] FLAG: --node-status-max-images="50"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892667 4789 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892671 4789 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892675 4789 flags.go:64] FLAG: --pod-cidr=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892681 4789 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892687 4789 flags.go:64] FLAG: --pod-manifest-path=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892691 4789 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892695 4789 flags.go:64] FLAG: --pods-per-core="0"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892700 4789 flags.go:64] FLAG: --port="10250"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892704 4789 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892708 4789 flags.go:64] FLAG: --provider-id=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892712 4789 flags.go:64] FLAG: --qos-reserved=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892716 4789 flags.go:64] FLAG: --read-only-port="10255"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892720 4789 flags.go:64] FLAG: --register-node="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892724 4789 flags.go:64] FLAG: --register-schedulable="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892728 4789 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892734 4789 flags.go:64] FLAG: --registry-burst="10"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892738 4789 flags.go:64] FLAG: --registry-qps="5"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892742 4789 flags.go:64] FLAG: --reserved-cpus=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892746 4789 flags.go:64] FLAG: --reserved-memory=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892752 4789 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892755 4789 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892759 4789 flags.go:64] FLAG: --rotate-certificates="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892763 4789 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892767 4789 flags.go:64] FLAG: --runonce="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892771 4789 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892775 4789 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892779 4789 flags.go:64] FLAG: --seccomp-default="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892783 4789 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892787 4789 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892792 4789 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892797 4789 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892801 4789 flags.go:64] FLAG: --storage-driver-password="root"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892805 4789 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892809 4789 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892813 4789 flags.go:64] FLAG: --storage-driver-user="root"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892817 4789 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892821 4789 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892825 4789 flags.go:64] FLAG: --system-cgroups=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892829 4789 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892835 4789 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892840 4789 flags.go:64] FLAG: --tls-cert-file=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892844 4789 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892849 4789 flags.go:64] FLAG: --tls-min-version=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892853 4789 flags.go:64] FLAG: --tls-private-key-file=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892857 4789 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892861 4789 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892864 4789 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892868 4789 flags.go:64] FLAG: --v="2"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892874 4789 flags.go:64] FLAG: --version="false"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892879 4789 flags.go:64] FLAG: --vmodule=""
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892884 4789 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.892888 4789 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893006 4789 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893011 4789 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893015 4789 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893019 4789 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893023 4789 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893027 4789 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893031 4789 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893035 4789 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893039 4789 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893042 4789 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893046 4789 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893049 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893063 4789 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893068 4789 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893072 4789 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893076 4789 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893080 4789 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893084 4789 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893088 4789 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893094 4789 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893098 4789 feature_gate.go:330] unrecognized feature gate: Example
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893102 4789 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893106 4789 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893111 4789 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893114 4789 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893118 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893122 4789 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893126 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893130 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893134 4789 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893138 4789 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893141 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893145 4789 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893148 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893152 4789 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893155 4789 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893159 4789 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893162 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893165 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893169 4789 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893172 4789 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893176 4789 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893179 4789 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893182 4789 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893193 4789 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893196 4789 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893200 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893203 4789 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893214 4789 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893218 4789 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893222 4789 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893226 4789 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893232 4789 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893236 4789 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893239 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893243 4789 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893246 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893249 4789 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893253 4789 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893256 4789 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893260 4789 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893263 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893267 4789 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893270 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893273 4789 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893277 4789 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893281 4789 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893285 4789 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893290 4789 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893293 4789 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.893297 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.893711 4789 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.907451 4789 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.907499 4789 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908046 4789 feature_gate.go:330] unrecognized feature gate: Example
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908142 4789 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908159 4789 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908180 4789 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908198 4789 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908213 4789 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908236 4789 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908249 4789 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908261 4789 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908277 4789 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908290 4789 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908301 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908313 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908339 4789 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908351 4789 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908369 4789 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908383 4789 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908398 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908409 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908421 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908432 4789 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908443 4789 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908456 4789 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908466 4789 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908487 4789 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908499 4789 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908510 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908521 4789 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908595 4789 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216
06:51:01.908608 4789 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908619 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.908758 4789 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909142 4789 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909155 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909165 4789 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909175 4789 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909185 4789 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909195 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909203 4789 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909212 4789 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909221 4789 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909229 4789 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909237 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909245 4789 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot 
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909253 4789 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909261 4789 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909269 4789 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909277 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909285 4789 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909293 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909301 4789 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909309 4789 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909319 4789 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909328 4789 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909336 4789 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909358 4789 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909366 4789 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909375 4789 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909383 4789 feature_gate.go:330] 
unrecognized feature gate: GatewayAPI Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909395 4789 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909413 4789 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909425 4789 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909441 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909453 4789 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909465 4789 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909474 4789 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909483 4789 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909491 4789 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909502 4789 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909513 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909523 4789 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.909539 4789 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909869 4789 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909880 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909892 4789 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909905 4789 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909943 4789 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909954 4789 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909963 4789 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909971 4789 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909979 4789 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909987 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.909995 4789 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910003 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910012 4789 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910020 4789 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910029 4789 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910037 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910045 4789 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 06:51:01 crc 
kubenswrapper[4789]: W1216 06:51:01.910053 4789 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910063 4789 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910071 4789 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910079 4789 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910089 4789 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910099 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910107 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910116 4789 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910124 4789 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910134 4789 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910142 4789 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910151 4789 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910159 4789 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910168 4789 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910176 4789 
feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910184 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910193 4789 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910201 4789 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910209 4789 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910217 4789 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910226 4789 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910235 4789 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910242 4789 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910250 4789 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910258 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910267 4789 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910275 4789 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910283 4789 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910292 4789 feature_gate.go:330] unrecognized feature gate: 
RouteAdvertisements Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910300 4789 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910310 4789 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910321 4789 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910332 4789 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910340 4789 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910350 4789 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910359 4789 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910367 4789 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910375 4789 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910384 4789 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910394 4789 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910405 4789 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910415 4789 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910426 4789 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 
06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910437 4789 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910448 4789 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910458 4789 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910466 4789 feature_gate.go:330] unrecognized feature gate: Example Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910476 4789 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910484 4789 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910492 4789 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910500 4789 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910511 4789 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910521 4789 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 06:51:01 crc kubenswrapper[4789]: W1216 06:51:01.910533 4789 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.910547 4789 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.910887 4789 server.go:940] "Client rotation is on, will bootstrap in background" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.915995 4789 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.916145 4789 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.917160 4789 server.go:997] "Starting client certificate rotation" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.917201 4789 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.917454 4789 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-25 01:40:03.33679248 +0000 UTC Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.917553 4789 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 210h49m1.41924286s for next certificate rotation Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.929939 4789 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.932974 4789 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.943763 4789 log.go:25] "Validated CRI v1 runtime API" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.964417 4789 log.go:25] "Validated CRI v1 image API" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.968847 4789 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.973181 4789 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-16-06-46-28-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 16 06:51:01 crc kubenswrapper[4789]: I1216 06:51:01.973250 4789 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.004200 4789 manager.go:217] Machine: {Timestamp:2025-12-16 06:51:02.001584896 +0000 UTC m=+0.263472615 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6f743a75-a9db-425a-b0df-337b667d61cb BootID:ab55fff6-145f-4b59-9cc0-4a36ab767ab4 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:3e:10:97 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:3e:10:97 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1c:c2:a2 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:44:21:8f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c9:c5:27 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:74:d4:c9 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:87:87:d1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c6:64:db:17:1f:b4 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a2:47:64:e3:4f:9c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.004717 4789 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.005255 4789 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.006554 4789 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.007020 4789 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.007108 4789 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.007544 4789 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.007568 4789 container_manager_linux.go:303] "Creating device plugin manager"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.008074 4789 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.008141 4789 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.008479 4789 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.008816 4789 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.010073 4789 kubelet.go:418] "Attempting to sync node with API server"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.010105 4789 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.010179 4789 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.010197 4789 kubelet.go:324] "Adding apiserver pod source"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.010225 4789 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.013061 4789 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.014494 4789 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.014504 4789 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.014627 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.014634 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.014821 4789 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.015714 4789 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016376 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016406 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016437 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016449 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016464 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016482 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016491 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016504 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016515 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016525 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016538 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.016546 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.017038 4789 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.017740 4789 server.go:1280] "Started kubelet"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.018264 4789 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.018818 4789 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.019385 4789 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 06:51:02 crc systemd[1]: Started Kubernetes Kubelet.
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.020614 4789 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.021384 4789 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.021449 4789 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.021817 4789 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.021843 4789 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.021753 4789 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 06:39:29.587178666 +0000 UTC
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.022141 4789 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.022210 4789 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.022503 4789 server.go:460] "Adding debug handlers to kubelet server"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.022334 4789 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18819f76ebd7f475 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 06:51:02.017700981 +0000 UTC m=+0.279588620,LastTimestamp:2025-12-16 06:51:02.017700981 +0000 UTC m=+0.279588620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.023003 4789 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.023069 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.026195 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="200ms"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.027813 4789 factory.go:55] Registering systemd factory
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.028126 4789 factory.go:221] Registration of the systemd container factory successfully
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.029187 4789 factory.go:153] Registering CRI-O factory
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.029210 4789 factory.go:221] Registration of the crio container factory successfully
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.029290 4789 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.029315 4789 factory.go:103] Registering Raw factory
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.029334 4789 manager.go:1196] Started watching for new ooms in manager
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.029994 4789 manager.go:319] Starting recovery of all containers
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044412 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044526 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044552 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044575 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044596 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044617 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044639 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044673 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044705 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044727 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044757 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044860 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044882 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.044907 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045030 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045054 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045083 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045113 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045142 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045171 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045200 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045229 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045791 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045821 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045848 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045869 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.045897 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046005 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046028 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046051 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046071 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046094 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046122 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046143 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046163 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046184 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046203 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046224 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046256 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046287 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046309 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046329 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046347 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046365 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046385 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046410 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046449 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046495 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046524 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046555 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046654 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046711 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046762 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046799 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046841 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046893 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.046984 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.047022 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.047051 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.047079 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.047106 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052188 4789 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052280 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052316 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052339 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052361 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052419 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052444 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052464 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052486 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052507 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052542 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052564 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052584 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052611 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052638 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052665 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052686 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052708 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052731 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052753 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052833 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052855 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052877 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052898 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052948 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.052972 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053002 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 16 06:51:02
crc kubenswrapper[4789]: I1216 06:51:02.053031 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053050 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053070 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053089 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053113 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053143 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053553 4789 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053579 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053606 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053636 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053672 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053702 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053724 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053744 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053764 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053783 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053804 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053838 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053862 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053887 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.053987 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054012 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054034 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054067 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054091 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" 
seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054111 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054132 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054153 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054180 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054200 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054219 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 
06:51:02.054239 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054258 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054275 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054293 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054322 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054343 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054364 4789 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054384 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054405 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054426 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054445 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054464 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054492 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054511 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054530 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054550 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054570 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054588 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054622 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054648 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054676 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054694 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054714 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054732 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054752 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" 
seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054814 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054835 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054866 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054885 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054940 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054959 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054978 4789 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.054997 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055019 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055046 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055099 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055124 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055143 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055162 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055182 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055208 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055237 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055268 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055288 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055307 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055327 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055346 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055364 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055386 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055407 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" 
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055425 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055444 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055463 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055482 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055500 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055520 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055540 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055559 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055578 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055598 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055622 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055649 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055669 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055689 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055709 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055728 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055748 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055767 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055785 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055804 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055833 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055856 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055906 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.055976 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056005 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056039 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056077 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056100 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056132 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056810 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056937 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056959 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.056978 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057024 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057041 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057061 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057103 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057121 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057140 4789 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057181 4789 reconstruct.go:97] "Volume reconstruction finished"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.057196 4789 reconciler.go:26] "Reconciler: start to sync state"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.061558 4789 manager.go:324] Recovery completed
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.077083 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.078561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.078602 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.078614 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.079556 4789 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.079591 4789 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.079624 4789 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.089637 4789 policy_none.go:49] "None policy: Start"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.092990 4789 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.093036 4789 state_mem.go:35] "Initializing new in-memory state store"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.101380 4789 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.103627 4789 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.103665 4789 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.103692 4789 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.103735 4789 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.105725 4789 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.105819 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.122407 4789 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.145694 4789 manager.go:334] "Starting Device Plugin manager"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.145772 4789 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.145793 4789 server.go:79] "Starting device plugin registration server"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.146456 4789 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.146488 4789 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.146769 4789 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.146897 4789 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.146934 4789 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.155600 4789 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.204669 4789 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.204751 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.205959 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.206031 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.206059 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.206256 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.206422 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.206467 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207295 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207316 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207324 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207325 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207343 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207351 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207438 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207463 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.207589 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.208497 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.208539 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.208558 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.208689 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.208787 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.208872 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209086 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209108 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209117 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209571 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209630 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209640 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209829 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.209965 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.210008 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211158 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211215 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211239 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211283 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211306 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211317 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211478 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211501 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211510 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211628 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.211655 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.212552 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.212579 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.212590 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.226965 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="400ms"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.247156 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.248829 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.248864 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.248875 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.248901 4789 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.249453 4789 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.259564 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260025 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260183 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260325 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260399 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260429 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260473 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260513 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260565 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260601 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260695 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260739 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260777 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.260827 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363545 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363598 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363618 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363679 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363715 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363728 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363749 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363769 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363737 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363802 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363810 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363840 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363887 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363933 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363938 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363825 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364021 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364034 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363990 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363966 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.363970 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364130 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364144 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364150 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364172 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364182 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364199 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.364287 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.450291 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.451368 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.451405 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.451414 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.451442 4789 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.451945 4789 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Dec 16 06:51:02 crc 
kubenswrapper[4789]: I1216 06:51:02.532096 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.549308 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.560143 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2cd5911637b7b2992bd52af024a0806520e4022a3562cdbd3b082df7c6fb5c20 WatchSource:0}: Error finding container 2cd5911637b7b2992bd52af024a0806520e4022a3562cdbd3b082df7c6fb5c20: Status 404 returned error can't find the container with id 2cd5911637b7b2992bd52af024a0806520e4022a3562cdbd3b082df7c6fb5c20 Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.566219 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-732db844ef50436da2a1fe294c283e7c2c253c578731c3417c50a9831ba44e20 WatchSource:0}: Error finding container 732db844ef50436da2a1fe294c283e7c2c253c578731c3417c50a9831ba44e20: Status 404 returned error can't find the container with id 732db844ef50436da2a1fe294c283e7c2c253c578731c3417c50a9831ba44e20 Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.574378 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.590678 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.592801 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0531843250dcdc75a6d248fd24e7afa3718f3ccea8d0baa46b6f09800f93c34b WatchSource:0}: Error finding container 0531843250dcdc75a6d248fd24e7afa3718f3ccea8d0baa46b6f09800f93c34b: Status 404 returned error can't find the container with id 0531843250dcdc75a6d248fd24e7afa3718f3ccea8d0baa46b6f09800f93c34b Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.596290 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.604140 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-24a07a06c591822814d5f0ba9cd9b2016a86da340ec42c6f1b1c9ad2d10cbbbf WatchSource:0}: Error finding container 24a07a06c591822814d5f0ba9cd9b2016a86da340ec42c6f1b1c9ad2d10cbbbf: Status 404 returned error can't find the container with id 24a07a06c591822814d5f0ba9cd9b2016a86da340ec42c6f1b1c9ad2d10cbbbf Dec 16 06:51:02 crc kubenswrapper[4789]: W1216 06:51:02.613369 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8d83fc4ee7ba12bb3c6d9b6162ff0326a4d91ff946f6fc5e3859933d64847e86 WatchSource:0}: Error finding container 8d83fc4ee7ba12bb3c6d9b6162ff0326a4d91ff946f6fc5e3859933d64847e86: Status 404 returned error can't find the container with id 8d83fc4ee7ba12bb3c6d9b6162ff0326a4d91ff946f6fc5e3859933d64847e86 Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.628219 4789 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="800ms" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.852659 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.854090 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.854128 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.854143 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:02 crc kubenswrapper[4789]: I1216 06:51:02.854180 4789 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:51:02 crc kubenswrapper[4789]: E1216 06:51:02.854767 4789 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Dec 16 06:51:03 crc kubenswrapper[4789]: W1216 06:51:03.017443 4789 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Dec 16 06:51:03 crc kubenswrapper[4789]: E1216 06:51:03.017946 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection 
refused" logger="UnhandledError" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.019079 4789 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.022304 4789 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 01:48:59.349830012 +0000 UTC Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.022371 4789 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 690h57m56.327462243s for next certificate rotation Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.113895 4789 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24" exitCode=0 Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.113995 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.114092 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8d83fc4ee7ba12bb3c6d9b6162ff0326a4d91ff946f6fc5e3859933d64847e86"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.114200 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.115865 4789 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="00e1338d46be4c39f1697b448c1a5723a8c283d8032fc59374f7c7abb24b4007" exitCode=0 Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.115942 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"00e1338d46be4c39f1697b448c1a5723a8c283d8032fc59374f7c7abb24b4007"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.115965 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"24a07a06c591822814d5f0ba9cd9b2016a86da340ec42c6f1b1c9ad2d10cbbbf"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.116105 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.116129 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.116140 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.116660 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.117694 4789 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f" exitCode=0 Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.117751 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f"} Dec 16 06:51:03 crc 
kubenswrapper[4789]: I1216 06:51:03.117794 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0531843250dcdc75a6d248fd24e7afa3718f3ccea8d0baa46b6f09800f93c34b"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.117817 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.117837 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.117847 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.117924 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.118537 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.119113 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.119165 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.119176 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.119242 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.119261 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:03 
crc kubenswrapper[4789]: I1216 06:51:03.119273 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.121434 4789 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e" exitCode=0 Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.121517 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.121555 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"732db844ef50436da2a1fe294c283e7c2c253c578731c3417c50a9831ba44e20"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.121645 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.122183 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.122208 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.122218 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.123163 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2"} Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.123184 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2cd5911637b7b2992bd52af024a0806520e4022a3562cdbd3b082df7c6fb5c20"} Dec 16 06:51:03 crc kubenswrapper[4789]: W1216 06:51:03.131465 4789 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Dec 16 06:51:03 crc kubenswrapper[4789]: E1216 06:51:03.131553 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Dec 16 06:51:03 crc kubenswrapper[4789]: W1216 06:51:03.250194 4789 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Dec 16 06:51:03 crc kubenswrapper[4789]: E1216 06:51:03.250289 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Dec 16 06:51:03 crc kubenswrapper[4789]: W1216 06:51:03.411362 4789 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Dec 16 06:51:03 crc kubenswrapper[4789]: E1216 06:51:03.412126 4789 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Dec 16 06:51:03 crc kubenswrapper[4789]: E1216 06:51:03.428869 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="1.6s" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.654967 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.656881 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.656929 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.656941 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:03 crc kubenswrapper[4789]: I1216 06:51:03.656967 4789 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.130692 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.130746 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.130756 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.130769 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.130778 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.130891 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.132179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.132234 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.132251 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.133671 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"803cd9461fe634bcc8e29801c42e4ebecb48a692ffc83cf3de801a29f6b6bc85"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.133928 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.135180 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.135212 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.135222 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.136929 4789 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd" exitCode=0 Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.136987 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.137214 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.138387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.138444 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.138464 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.140733 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.140762 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.140774 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.140997 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.142297 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.142323 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.142332 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.144522 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.144546 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.144557 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3"} Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.144608 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.145244 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.145262 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.145271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:04 crc kubenswrapper[4789]: I1216 06:51:04.722281 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.150023 4789 generic.go:334] "Generic 
(PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a" exitCode=0 Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.150131 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a"} Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.150203 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.150291 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.150291 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.151548 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.151577 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.151592 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.151603 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.151611 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.151628 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:05 crc 
kubenswrapper[4789]: I1216 06:51:05.152120 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.152157 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.152171 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:05 crc kubenswrapper[4789]: I1216 06:51:05.513005 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.157986 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335"} Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.158046 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.158085 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270"} Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.158116 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.158121 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e"} Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.158276 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71"} Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.158991 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.159020 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.159029 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.159483 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.159560 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.159595 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.885754 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.987616 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:06 crc kubenswrapper[4789]: I1216 06:51:06.995793 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.164930 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a"} Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.164954 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.164985 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.166100 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.166139 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.166147 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.166876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.166907 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.166935 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.663888 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.664091 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.664136 4789 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.665630 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.665715 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:07 crc kubenswrapper[4789]: I1216 06:51:07.665746 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.033092 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.168472 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.168566 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.169797 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.169832 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.169847 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.170541 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.170599 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 
06:51:08.170621 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.365411 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.365654 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.365713 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.368050 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.368115 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.368139 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:08 crc kubenswrapper[4789]: I1216 06:51:08.537024 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.171991 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.172186 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.174072 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.174239 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 
06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.174266 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.174104 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.174384 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.174411 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.885871 4789 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:51:09 crc kubenswrapper[4789]: I1216 06:51:09.886045 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:51:11 crc kubenswrapper[4789]: I1216 06:51:11.250904 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 16 06:51:11 crc kubenswrapper[4789]: I1216 06:51:11.251159 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:11 crc kubenswrapper[4789]: I1216 06:51:11.252424 4789 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:11 crc kubenswrapper[4789]: I1216 06:51:11.252472 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:11 crc kubenswrapper[4789]: I1216 06:51:11.252485 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:11 crc kubenswrapper[4789]: I1216 06:51:11.318366 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 16 06:51:12 crc kubenswrapper[4789]: E1216 06:51:12.155714 4789 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 06:51:12 crc kubenswrapper[4789]: I1216 06:51:12.179101 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:12 crc kubenswrapper[4789]: I1216 06:51:12.180020 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:12 crc kubenswrapper[4789]: I1216 06:51:12.180071 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:12 crc kubenswrapper[4789]: I1216 06:51:12.180088 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:13 crc kubenswrapper[4789]: E1216 06:51:13.658302 4789 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 16 06:51:14 crc kubenswrapper[4789]: I1216 06:51:14.020902 4789 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 16 06:51:14 crc 
kubenswrapper[4789]: I1216 06:51:14.795681 4789 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 06:51:14 crc kubenswrapper[4789]: I1216 06:51:14.795762 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 06:51:14 crc kubenswrapper[4789]: I1216 06:51:14.800070 4789 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 06:51:14 crc kubenswrapper[4789]: I1216 06:51:14.800166 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.259008 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.260390 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.260420 4789 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.260430 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.260452 4789 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.519005 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.519170 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.523202 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.523242 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:15 crc kubenswrapper[4789]: I1216 06:51:15.523253 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:18 crc kubenswrapper[4789]: I1216 06:51:18.371857 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:18 crc kubenswrapper[4789]: I1216 06:51:18.372035 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:18 crc kubenswrapper[4789]: I1216 06:51:18.375239 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:18 crc kubenswrapper[4789]: I1216 06:51:18.375283 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:18 crc kubenswrapper[4789]: I1216 06:51:18.375297 
4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:18 crc kubenswrapper[4789]: I1216 06:51:18.376204 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.198289 4789 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.200026 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.200077 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.200089 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.788834 4789 trace.go:236] Trace[786692283]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:51:05.224) (total time: 14564ms): Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[786692283]: ---"Objects listed" error: 14564ms (06:51:19.788) Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[786692283]: [14.564250342s] [14.564250342s] END Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.788868 4789 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.789607 4789 trace.go:236] Trace[1101763167]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:51:05.891) (total time: 13897ms): Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[1101763167]: ---"Objects listed" error: 13897ms (06:51:19.789) Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[1101763167]: [13.897732628s] [13.897732628s] 
END Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.789632 4789 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.791314 4789 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 16 06:51:19 crc kubenswrapper[4789]: E1216 06:51:19.800820 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.801711 4789 trace.go:236] Trace[1620664621]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:51:05.924) (total time: 13877ms): Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[1620664621]: ---"Objects listed" error: 13877ms (06:51:19.801) Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[1620664621]: [13.877572981s] [13.877572981s] END Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.801734 4789 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.803813 4789 trace.go:236] Trace[733551254]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 06:51:05.074) (total time: 14728ms): Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[733551254]: ---"Objects listed" error: 14728ms (06:51:19.803) Dec 16 06:51:19 crc kubenswrapper[4789]: Trace[733551254]: [14.728888672s] [14.728888672s] END Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.803838 4789 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.840703 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 
16 06:51:19 crc kubenswrapper[4789]: I1216 06:51:19.845223 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.020912 4789 apiserver.go:52] "Watching apiserver" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.024300 4789 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.024559 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-ckj56","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.024862 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.024996 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.025199 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.025268 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.025589 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.025271 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.025643 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.025320 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.025795 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.025851 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.027866 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.027885 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.028390 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.032492 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.035011 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.035919 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.036086 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.036220 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.036396 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.037385 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 
06:51:20.038560 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.040475 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.099239 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.123549 4789 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.128524 4789 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.128576 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.128985 4789 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.129036 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.129397 4789 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.129428 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.140235 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.151177 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.165010 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.179731 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.191906 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193057 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193097 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193115 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193132 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193151 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193170 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193186 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193201 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193218 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193253 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193268 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193283 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193298 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193314 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193330 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193344 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193359 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193375 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193391 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193406 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193422 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193471 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193488 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193503 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193499 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193521 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193537 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193552 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193567 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193565 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193585 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193651 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193665 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193675 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193697 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193715 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193733 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193749 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193766 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193782 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193800 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193812 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193817 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193837 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193863 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193885 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193907 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193940 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193956 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193968 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.193971 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194004 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194023 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194027 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194043 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194090 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194108 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194126 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194142 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194161 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194178 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194195 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194240 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194262 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194287 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " 
Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194303 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194305 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194320 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194355 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194355 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194371 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194387 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194404 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194411 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194420 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194436 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194453 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194460 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194469 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194485 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194501 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194516 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194518 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194532 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194550 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194552 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194566 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194582 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194600 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194617 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194651 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194668 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194683 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194700 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194720 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194736 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194752 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194770 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194785 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194800 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194814 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194807 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194834 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194852 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194866 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194883 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194899 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194918 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194947 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194962 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194977 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194993 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195009 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195024 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195039 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195054 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195070 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195085 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195102 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195120 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195137 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195192 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195210 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195229 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195249 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195269 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195288 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195304 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195319 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195336 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195351 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195365 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195383 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195399 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195417 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195435 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195451 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195468 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195487 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195504 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195526 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195544 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195560 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195576 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195592 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195609 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195627 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195645 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195661 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195676 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195703 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195721 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195738 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195755 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195771 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195787 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195803 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195822 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195840 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195857 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195874 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195890 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195906 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195935 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196026 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196046 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196065 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196081 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196099 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196115 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196130 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196151 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196168 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196186 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196203 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196221 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196239 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196256 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196274 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 06:51:20 crc 
kubenswrapper[4789]: I1216 06:51:20.196291 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196308 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196323 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196340 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196356 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196397 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196418 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196435 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196452 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196468 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196485 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 
06:51:20.196501 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196519 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196535 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196552 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196569 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196586 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") 
pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196604 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196620 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196638 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196656 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196675 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196693 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196711 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196728 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196747 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196764 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196781 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196799 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196815 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196849 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196878 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196896 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196917 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196963 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196982 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197000 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197021 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197041 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197114 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197132 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197153 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqkj\" (UniqueName: \"kubernetes.io/projected/483d75f6-45a1-4182-b56b-9eff94bbed13-kube-api-access-hdqkj\") pod \"node-resolver-ckj56\" (UID: \"483d75f6-45a1-4182-b56b-9eff94bbed13\") " pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197174 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197196 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/483d75f6-45a1-4182-b56b-9eff94bbed13-hosts-file\") pod \"node-resolver-ckj56\" (UID: \"483d75f6-45a1-4182-b56b-9eff94bbed13\") " pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197437 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197459 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197507 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197518 4789 reconciler_common.go:293] "Volume detached for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197528 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197539 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197549 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197559 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197569 4789 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197579 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197589 4789 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc 
kubenswrapper[4789]: I1216 06:51:20.197599 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197611 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197622 4789 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197632 4789 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197644 4789 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.194945 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195040 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195084 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195101 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195253 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195372 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195407 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195410 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195536 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195610 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195615 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195626 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195698 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195792 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195812 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.195847 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196004 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196012 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196032 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196163 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196346 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196374 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196406 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196566 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196604 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211654 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196617 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212471 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.213428 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.213468 4789 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214455 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214870 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196673 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196782 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196833 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196964 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.196993 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197022 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197177 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197197 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197210 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197344 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197391 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197504 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197509 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.197586 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.203081 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.203530 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.203902 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.203960 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204075 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204195 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204354 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204400 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204487 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204641 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204745 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204802 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204823 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.204931 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.205221 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.205324 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.205401 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.205599 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.205646 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.205668 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.205735 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.206156 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.206310 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.206536 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.206579 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.206951 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207068 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207231 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207252 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207380 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207464 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207560 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207843 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207901 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.207997 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.208556 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.209052 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.209194 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.209662 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.209786 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.209806 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.209898 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.210159 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.210427 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.210546 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.210551 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.210891 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211034 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211045 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211106 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211160 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211190 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211317 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211476 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211735 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211750 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211765 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211775 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211787 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.211950 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212020 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.212077 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212286 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212485 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212513 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212549 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.212563 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212622 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212733 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212833 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.212858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.213091 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.213234 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214130 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214015 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214248 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214352 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.214368 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214674 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214731 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214924 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.214966 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.216333 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.216360 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.216369 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.216628 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.216705 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.216835 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.217040 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.217053 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.217287 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.219511 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.219540 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.220773 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:20.716663215 +0000 UTC m=+18.978550944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.220999 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:20.720983306 +0000 UTC m=+18.982871035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.221158 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.221208 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.221711 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.222339 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:51:20.722325442 +0000 UTC m=+18.984213071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.222434 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.222919 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.223319 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.223562 4789 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6" exitCode=255 Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.223814 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.224207 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.224224 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.224619 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.224717 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.224777 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.225018 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.226648 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.227341 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.227881 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.227996 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:20.727971098 +0000 UTC m=+18.989858727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.227993 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.228375 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.228452 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.228565 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.228792 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.229622 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.229981 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.230426 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.231157 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.231190 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.231262 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.231310 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.231453 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.231868 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.232774 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.232959 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.232988 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.233833 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.237151 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.237315 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.237862 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.238288 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.242426 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.242470 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.242499 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.242827 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.244330 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.244520 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.244796 4789 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.246639 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:20.746591507 +0000 UTC m=+19.008479136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.248231 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.248257 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.248647 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.249063 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.249315 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.250636 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.251507 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.251508 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.251967 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.251953 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.251984 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.252053 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.252264 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.252554 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.252761 4789 scope.go:117] "RemoveContainer" containerID="a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.252826 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.264863 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.272411 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.285764 4789 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.286041 4789 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.290538 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.290582 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.290593 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.290610 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.290622 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.296503 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.299251 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.308744 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.308970 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqkj\" (UniqueName: \"kubernetes.io/projected/483d75f6-45a1-4182-b56b-9eff94bbed13-kube-api-access-hdqkj\") pod \"node-resolver-ckj56\" (UID: \"483d75f6-45a1-4182-b56b-9eff94bbed13\") " pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.309066 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/483d75f6-45a1-4182-b56b-9eff94bbed13-hosts-file\") pod \"node-resolver-ckj56\" (UID: \"483d75f6-45a1-4182-b56b-9eff94bbed13\") " pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.309115 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320519 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320547 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320562 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320573 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320583 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320593 4789 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320605 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320614 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320622 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320632 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320640 4789 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320649 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320657 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" 
DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320671 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320679 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320687 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320696 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320706 4789 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320714 4789 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320723 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320733 4789 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320744 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320752 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320761 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320771 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320779 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320787 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320795 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320806 4789 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320814 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320822 4789 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320831 4789 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320841 4789 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320850 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320858 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node 
\"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320869 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320880 4789 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320889 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320897 4789 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320911 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320933 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320941 4789 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc 
kubenswrapper[4789]: I1216 06:51:20.320949 4789 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320960 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320969 4789 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320977 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320985 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.320997 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321006 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321016 4789 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321029 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321041 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321049 4789 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321057 4789 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321069 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321079 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321088 4789 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321097 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321107 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321115 4789 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321124 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321133 4789 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321143 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321152 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321160 4789 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321186 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321195 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321204 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321212 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321223 4789 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321232 4789 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321242 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321252 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321263 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321273 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321282 4789 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321292 4789 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321302 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on 
node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321310 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321319 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321331 4789 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321339 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321347 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321357 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321368 4789 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 
06:51:20.321377 4789 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321386 4789 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321394 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321405 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321413 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321421 4789 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321433 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321442 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321450 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321458 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321469 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321477 4789 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321486 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321495 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321506 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" 
DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321515 4789 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321523 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321534 4789 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321543 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321556 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321565 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321576 4789 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321585 4789 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321593 4789 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321601 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321611 4789 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321619 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321628 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321637 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321648 4789 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321657 4789 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321666 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321676 4789 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321684 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321693 4789 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321701 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321711 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath 
\"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321719 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321729 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321737 4789 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321749 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321757 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321766 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321775 4789 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 
crc kubenswrapper[4789]: I1216 06:51:20.321785 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321793 4789 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321801 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321812 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321820 4789 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321829 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321837 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321849 4789 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321857 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321865 4789 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321873 4789 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321883 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321892 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321902 4789 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321915 4789 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321935 4789 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321943 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321952 4789 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321962 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321971 4789 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321979 4789 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321987 4789 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.321998 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322006 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322014 4789 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322022 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322015 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322032 4789 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322077 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322090 4789 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322100 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322114 4789 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322122 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322131 4789 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322139 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322149 4789 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath 
\"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322157 4789 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322166 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322182 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322191 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322199 4789 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322207 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322217 4789 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc 
kubenswrapper[4789]: I1216 06:51:20.322225 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322233 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322241 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322251 4789 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322260 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322268 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322279 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322403 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.322695 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/483d75f6-45a1-4182-b56b-9eff94bbed13-hosts-file\") pod \"node-resolver-ckj56\" (UID: \"483d75f6-45a1-4182-b56b-9eff94bbed13\") " pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.325677 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.329732 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.336302 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.345769 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.345802 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.345811 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.345824 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.345833 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.346120 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.347608 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqkj\" (UniqueName: \"kubernetes.io/projected/483d75f6-45a1-4182-b56b-9eff94bbed13-kube-api-access-hdqkj\") pod \"node-resolver-ckj56\" (UID: \"483d75f6-45a1-4182-b56b-9eff94bbed13\") " pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.347726 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ckj56" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.360727 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.367198 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.368096 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.369221 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.378042 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.378131 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.378145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.378193 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.378208 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.380055 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: W1216 06:51:20.390704 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod483d75f6_45a1_4182_b56b_9eff94bbed13.slice/crio-7ff0353a8d03b50a720030fc0dd186b9a21aa74ee0272e6bcfaf67b06baa518f WatchSource:0}: Error finding container 7ff0353a8d03b50a720030fc0dd186b9a21aa74ee0272e6bcfaf67b06baa518f: Status 404 returned error can't find the container with id 7ff0353a8d03b50a720030fc0dd186b9a21aa74ee0272e6bcfaf67b06baa518f Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.394167 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.395227 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: W1216 06:51:20.399233 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-70d6a34977d9ea6bb38cf7affc4d80b0f5f756eb730c6df5e614421d209badc7 WatchSource:0}: Error finding container 70d6a34977d9ea6bb38cf7affc4d80b0f5f756eb730c6df5e614421d209badc7: Status 404 returned error can't find the container with id 70d6a34977d9ea6bb38cf7affc4d80b0f5f756eb730c6df5e614421d209badc7 Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.399463 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.399518 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.399531 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.399600 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.399618 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.408794 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.409504 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.412851 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.412876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.412886 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.412900 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.412925 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.419180 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.422695 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.422718 4789 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.424257 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.424454 4789 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.427368 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.427399 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.427408 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.427422 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.427431 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.429475 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.441593 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 
06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\"
:\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.456021 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.468922 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.530571 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.530604 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.530613 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.530625 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.530635 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.633532 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.634097 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.634151 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.634175 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.634187 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.725058 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.725207 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.725295 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.725392 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.725502 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:21.72548122 +0000 UTC m=+19.987368849 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.725514 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.725625 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:21.725599671 +0000 UTC m=+19.987487510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.725754 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:51:21.725721394 +0000 UTC m=+19.987609023 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.736406 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.736448 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.736458 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.736474 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.736484 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.825733 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.825777 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.825886 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.825901 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.825914 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.825991 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:21.825976254 +0000 UTC m=+20.087863883 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.826017 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.826063 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.826087 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: E1216 06:51:20.826182 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:21.826141686 +0000 UTC m=+20.088029385 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.838977 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.839193 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.839264 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.839340 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.839410 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.942268 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.942498 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.942573 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.942639 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:20 crc kubenswrapper[4789]: I1216 06:51:20.942704 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:20Z","lastTransitionTime":"2025-12-16T06:51:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.045512 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.045799 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.045884 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.046006 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.046097 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.147944 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.148007 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.148022 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.148047 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.148063 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.228254 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.228332 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.228358 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"70d6a34977d9ea6bb38cf7affc4d80b0f5f756eb730c6df5e614421d209badc7"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.230345 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.230499 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f8ed0c084fbd0d1bc257e0a3ce271e529bbc7ff7e52d0a3307a2fbe7878b682c"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.231388 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"86288b7f5bbd2e7e1db432c675bde716e571af13b7505881792a11953ff20b14"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.232603 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ckj56" event={"ID":"483d75f6-45a1-4182-b56b-9eff94bbed13","Type":"ContainerStarted","Data":"c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.232758 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ckj56" event={"ID":"483d75f6-45a1-4182-b56b-9eff94bbed13","Type":"ContainerStarted","Data":"7ff0353a8d03b50a720030fc0dd186b9a21aa74ee0272e6bcfaf67b06baa518f"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.234869 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.236580 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.250341 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.250389 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.250399 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.250411 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.250421 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.250673 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.261568 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.282989 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.301683 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.312670 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.327855 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.344135 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.352119 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.352149 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.352158 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.352172 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.352181 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.357253 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 
06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\"
:\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.360094 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.371987 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.372749 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.387196 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.403888 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.420429 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.435890 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.448133 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.454585 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.454629 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.454642 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.454657 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.454670 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.459956 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.469824 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.472572 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.485039 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.503839 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.557171 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.557204 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.557214 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.557229 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.557241 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.659071 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.659096 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.659104 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.659119 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.659129 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.707104 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-pdg87"] Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.707420 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.714273 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.714593 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbvfm"] Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.714718 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.715032 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.715399 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.715476 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-58dsj"] Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.715498 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.716248 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.716677 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-b8tnx"] Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.717211 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.721846 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.731548 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.731608 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.731635 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.731712 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.731755 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:23.731742937 +0000 UTC m=+21.993630566 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.732054 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.732090 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:51:23.732061711 +0000 UTC m=+21.993949350 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.732120 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 06:51:23.732109681 +0000 UTC m=+21.993997410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.732536 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.732566 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.732578 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.732588 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.732672 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.732964 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.733073 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.733173 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.734507 4789 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.734513 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.734509 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.734558 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.735831 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.742134 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.761455 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.761508 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.761519 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.761536 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.761549 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.770966 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9
d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.786691 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.808982 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.831722 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832135 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-slash\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832158 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blqv4\" (UniqueName: \"kubernetes.io/projected/02a3f8b3-6393-4e58-9b49-506f85204b08-kube-api-access-blqv4\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832175 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-var-lib-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832188 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-cnibin\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832201 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-kubelet\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832255 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32431466-a255-4bf2-9237-4f48eab4a71e-cni-binary-copy\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832271 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5shb\" (UniqueName: \"kubernetes.io/projected/32431466-a255-4bf2-9237-4f48eab4a71e-kube-api-access-m5shb\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832286 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-os-release\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832298 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-netns\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832312 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-log-socket\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832337 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgv9\" (UniqueName: \"kubernetes.io/projected/529ecdde-d194-4bf4-9e89-4accd6630349-kube-api-access-xcgv9\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832364 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-kubelet\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc 
kubenswrapper[4789]: I1216 06:51:21.832386 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-proxy-tls\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832558 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832574 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/529ecdde-d194-4bf4-9e89-4accd6630349-cni-binary-copy\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832646 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-cni-multus\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832694 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-hostroot\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " 
pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.832739 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-multus-certs\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833005 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktmbw\" (UniqueName: \"kubernetes.io/projected/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-kube-api-access-ktmbw\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833033 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833048 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-script-lib\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833067 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/32431466-a255-4bf2-9237-4f48eab4a71e-multus-daemon-config\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833190 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/529ecdde-d194-4bf4-9e89-4accd6630349-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833293 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-env-overrides\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833330 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-cni-bin\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833358 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-etc-kubernetes\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833403 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833446 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-node-log\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833498 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833532 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-bin\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833559 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-config\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833589 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-netns\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.833602 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.833621 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833621 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-conf-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.833635 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.833688 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:23.833676319 +0000 UTC m=+22.095563948 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833723 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-etc-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833782 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-mcd-auth-proxy-config\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833803 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-systemd-units\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833821 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-os-release\") pod \"multus-58dsj\" (UID: 
\"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833835 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-ovn\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833850 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-netd\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833863 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-cni-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833878 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-k8s-cni-cncf-io\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.833961 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834065 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-system-cni-dir\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834098 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-systemd\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834319 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-rootfs\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834337 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.834181 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:21 
crc kubenswrapper[4789]: E1216 06:51:21.834387 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.834397 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:21 crc kubenswrapper[4789]: E1216 06:51:21.834426 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:23.834417287 +0000 UTC m=+22.096304916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834465 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-system-cni-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834490 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-socket-dir-parent\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834508 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-cnibin\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.834526 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02a3f8b3-6393-4e58-9b49-506f85204b08-ovn-node-metrics-cert\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.844464 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.858796 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.864159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.864209 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.864222 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.864241 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.864250 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.875441 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.889446 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.902912 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.914002 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.927880 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935081 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-bin\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935149 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-config\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935169 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-netns\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935185 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-conf-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935205 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-etc-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935232 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-mcd-auth-proxy-config\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935255 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-systemd-units\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935277 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-os-release\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935268 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-netns\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935298 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-conf-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935299 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-ovn\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935350 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-systemd-units\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935342 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-etc-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935383 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-ovn\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935375 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-os-release\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935374 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-netd\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935416 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-netd\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935452 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-cni-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935449 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-bin\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935509 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-cni-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935539 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-k8s-cni-cncf-io\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935694 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-system-cni-dir\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935726 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-systemd\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935744 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-k8s-cni-cncf-io\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935774 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-system-cni-dir\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935774 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-rootfs\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935795 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-systemd\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935747 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-rootfs\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935827 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935869 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-system-cni-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935885 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-socket-dir-parent\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935920 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-cnibin\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935938 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02a3f8b3-6393-4e58-9b49-506f85204b08-ovn-node-metrics-cert\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935969 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-slash\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935977 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-mcd-auth-proxy-config\") 
pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.935985 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blqv4\" (UniqueName: \"kubernetes.io/projected/02a3f8b3-6393-4e58-9b49-506f85204b08-kube-api-access-blqv4\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936024 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-var-lib-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936044 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-cnibin\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936064 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-kubelet\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936081 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32431466-a255-4bf2-9237-4f48eab4a71e-cni-binary-copy\") pod \"multus-58dsj\" (UID: 
\"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936096 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5shb\" (UniqueName: \"kubernetes.io/projected/32431466-a255-4bf2-9237-4f48eab4a71e-kube-api-access-m5shb\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936118 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-os-release\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936132 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-netns\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936147 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-log-socket\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936162 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcgv9\" (UniqueName: \"kubernetes.io/projected/529ecdde-d194-4bf4-9e89-4accd6630349-kube-api-access-xcgv9\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " 
pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936179 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-kubelet\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936197 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-proxy-tls\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936213 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936218 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-config\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936228 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/529ecdde-d194-4bf4-9e89-4accd6630349-cni-binary-copy\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " 
pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936257 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936265 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-cni-multus\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936287 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-hostroot\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936295 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-system-cni-dir\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936304 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-multus-certs\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936328 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-multus-socket-dir-parent\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936337 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktmbw\" (UniqueName: \"kubernetes.io/projected/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-kube-api-access-ktmbw\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936354 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-cnibin\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936354 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936377 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 
crc kubenswrapper[4789]: I1216 06:51:21.936386 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-script-lib\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936402 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-cni-multus\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936410 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32431466-a255-4bf2-9237-4f48eab4a71e-multus-daemon-config\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936431 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-hostroot\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936432 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/529ecdde-d194-4bf4-9e89-4accd6630349-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936451 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-env-overrides\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936468 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-cni-bin\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936486 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-etc-kubernetes\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936522 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-node-log\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936542 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936596 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936758 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/529ecdde-d194-4bf4-9e89-4accd6630349-cni-binary-copy\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936794 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-var-lib-openvswitch\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936836 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-cnibin\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936866 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-kubelet\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937023 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-env-overrides\") pod \"ovnkube-node-pbvfm\" (UID: 
\"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937060 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-slash\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937192 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-log-socket\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937277 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-os-release\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937291 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-netns\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937352 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-kubelet\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc 
kubenswrapper[4789]: I1216 06:51:21.937524 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32431466-a255-4bf2-9237-4f48eab4a71e-cni-binary-copy\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937550 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-var-lib-cni-bin\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937554 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/529ecdde-d194-4bf4-9e89-4accd6630349-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.936454 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-host-run-multus-certs\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937578 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32431466-a255-4bf2-9237-4f48eab4a71e-etc-kubernetes\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937580 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-script-lib\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937595 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-node-log\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.937870 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32431466-a255-4bf2-9237-4f48eab4a71e-multus-daemon-config\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.938324 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/529ecdde-d194-4bf4-9e89-4accd6630349-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.947413 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.947868 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02a3f8b3-6393-4e58-9b49-506f85204b08-ovn-node-metrics-cert\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.949763 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-proxy-tls\") pod \"machine-config-daemon-pdg87\" (UID: 
\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.952835 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blqv4\" (UniqueName: \"kubernetes.io/projected/02a3f8b3-6393-4e58-9b49-506f85204b08-kube-api-access-blqv4\") pod \"ovnkube-node-pbvfm\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.953684 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktmbw\" (UniqueName: \"kubernetes.io/projected/ca24a4b9-4b99-4de7-887d-f8804a4f06bb-kube-api-access-ktmbw\") pod \"machine-config-daemon-pdg87\" (UID: \"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\") " pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.954857 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgv9\" (UniqueName: \"kubernetes.io/projected/529ecdde-d194-4bf4-9e89-4accd6630349-kube-api-access-xcgv9\") pod \"multus-additional-cni-plugins-b8tnx\" (UID: \"529ecdde-d194-4bf4-9e89-4accd6630349\") " pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.957474 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5shb\" (UniqueName: \"kubernetes.io/projected/32431466-a255-4bf2-9237-4f48eab4a71e-kube-api-access-m5shb\") pod \"multus-58dsj\" (UID: \"32431466-a255-4bf2-9237-4f48eab4a71e\") " pod="openshift-multus/multus-58dsj" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.962044 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.966011 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.966038 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.966048 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.966064 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.966075 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:21Z","lastTransitionTime":"2025-12-16T06:51:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.975083 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:21 crc kubenswrapper[4789]: I1216 06:51:21.992978 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:21Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.006604 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.018121 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.018570 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift
-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.026395 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.032624 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.032818 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-58dsj" Dec 16 06:51:22 crc kubenswrapper[4789]: W1216 06:51:22.036246 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a3f8b3_6393_4e58_9b49_506f85204b08.slice/crio-36ff7caac3087898e4430aa6803975a6c8c89772ccd913cca511135dc7bc72ff WatchSource:0}: Error finding container 36ff7caac3087898e4430aa6803975a6c8c89772ccd913cca511135dc7bc72ff: Status 404 returned error can't find the container with id 36ff7caac3087898e4430aa6803975a6c8c89772ccd913cca511135dc7bc72ff Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.039366 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.045995 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d269
0462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.060244 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.069433 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.069464 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.069473 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.069489 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.069498 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.071341 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.085501 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.105031 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.105115 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:22 crc kubenswrapper[4789]: E1216 06:51:22.105179 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:22 crc kubenswrapper[4789]: E1216 06:51:22.105261 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.105306 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:22 crc kubenswrapper[4789]: E1216 06:51:22.105418 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.105481 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.111738 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.113176 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.114876 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.115673 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.117690 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.117865 4789 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.118431 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.119116 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.120471 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.121197 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.122160 
4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.122705 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.123431 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.124496 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.125368 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.126533 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.127046 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.128030 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.128399 
4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.128953 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.129542 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.129715 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.130500 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.131057 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.131877 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 
06:51:22.132565 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.133371 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.133978 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.135237 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.135723 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.136803 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.137318 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.137774 4789 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.138250 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.139896 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.140433 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.141359 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.141768 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.142851 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.143814 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.145099 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.146264 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.147373 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 16 
06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.148430 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.149319 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.150493 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.151584 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.152212 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.152802 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.153810 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.154526 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.154756 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.155968 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.156800 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.157479 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.158038 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.158678 4789 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.159635 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.165553 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\
"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.171412 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.171457 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.171471 4789 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.171492 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.171504 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.180771 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b1
7de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.193583 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.207095 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.232090 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.241579 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerStarted","Data":"771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.241660 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerStarted","Data":"553e7020ee3ab7c62393b786ce8da1359a572112ddfe5c3e94fa8cb9b6b0c4bf"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.242333 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerStarted","Data":"887336ec86effca6bec26617a50e1e0f94ef2d6538f376ff2b4baf5ed657659a"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.243772 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320" exitCode=0 Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.243829 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.243847 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" 
event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"36ff7caac3087898e4430aa6803975a6c8c89772ccd913cca511135dc7bc72ff"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.247145 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.247751 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.247777 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"f699fd6506859f4cd9ac7ab5fec0b49b67e13e766d5d2fe8841371e19c161695"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.253145 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.270775 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.274857 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.274897 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.274913 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 
06:51:22.274955 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.274967 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.287314 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.303871 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.315731 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.325693 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.346764 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.381212 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.381251 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.381264 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.381279 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.381289 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.412143 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.431055 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.448294 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.462943 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.479740 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.484130 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.484177 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 
06:51:22.484191 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.484212 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.484228 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.492072 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888a
f40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.518099 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.536498 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.552760 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.582496 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.592586 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.592623 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.592665 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.592689 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.592711 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.639268 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.682938 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.695165 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.695213 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.695223 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.695244 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.695256 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.702003 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.716207 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.797756 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.797795 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.797803 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.797818 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.797829 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.899979 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.900013 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.900022 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.900037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:22 crc kubenswrapper[4789]: I1216 06:51:22.900049 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:22Z","lastTransitionTime":"2025-12-16T06:51:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.003332 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.003368 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.003379 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.003393 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.003405 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.105414 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.105467 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.105478 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.105495 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.105507 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.209382 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.209807 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.209817 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.209833 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.209843 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.235391 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-wjmvq"] Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.235846 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.239091 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.239528 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.240110 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.242138 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.245701 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/902255f3-ae7f-4bce-bf64-b50fe8753a2b-host\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.245798 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfjp\" (UniqueName: \"kubernetes.io/projected/902255f3-ae7f-4bce-bf64-b50fe8753a2b-kube-api-access-lbfjp\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.245855 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/902255f3-ae7f-4bce-bf64-b50fe8753a2b-serviceca\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc 
kubenswrapper[4789]: I1216 06:51:23.252425 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.254626 4789 generic.go:334] "Generic (PLEG): container finished" podID="529ecdde-d194-4bf4-9e89-4accd6630349" containerID="61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297" exitCode=0 Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.254734 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerDied","Data":"61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.257516 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.257560 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.257577 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.257591 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.257654 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.259533 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.273330 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.293103 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.306125 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.316436 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.316467 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 
06:51:23.316475 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.316488 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.316497 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.316859 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888a
f40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.341847 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.346365 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/902255f3-ae7f-4bce-bf64-b50fe8753a2b-serviceca\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.346422 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/902255f3-ae7f-4bce-bf64-b50fe8753a2b-host\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.346537 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfjp\" (UniqueName: \"kubernetes.io/projected/902255f3-ae7f-4bce-bf64-b50fe8753a2b-kube-api-access-lbfjp\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.346539 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/902255f3-ae7f-4bce-bf64-b50fe8753a2b-host\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.347723 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/902255f3-ae7f-4bce-bf64-b50fe8753a2b-serviceca\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.351639 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.364397 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.370448 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfjp\" (UniqueName: \"kubernetes.io/projected/902255f3-ae7f-4bce-bf64-b50fe8753a2b-kube-api-access-lbfjp\") pod \"node-ca-wjmvq\" (UID: \"902255f3-ae7f-4bce-bf64-b50fe8753a2b\") " pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.385951 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.401617 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.416265 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.419582 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.419633 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.419644 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.419658 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.419669 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.429086 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.444151 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.459206 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.472719 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.485433 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db
6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.499807 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.513968 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.521676 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.521707 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.521716 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.521730 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.521741 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.526960 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.539616 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.550463 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.569766 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wjmvq" Dec 16 06:51:23 crc kubenswrapper[4789]: W1216 06:51:23.579936 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod902255f3_ae7f_4bce_bf64_b50fe8753a2b.slice/crio-ee044a0e6505f8bb82bb1da7725018be56bbf3e8b03a2bbf224d1ae20c3c7c43 WatchSource:0}: Error finding container ee044a0e6505f8bb82bb1da7725018be56bbf3e8b03a2bbf224d1ae20c3c7c43: Status 404 returned error can't find the container with id ee044a0e6505f8bb82bb1da7725018be56bbf3e8b03a2bbf224d1ae20c3c7c43 Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.592979 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.623746 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.623792 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.623805 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.623827 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.623839 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.644127 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715de
ab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.671070 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.711735 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.727315 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.727368 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.727381 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc 
kubenswrapper[4789]: I1216 06:51:23.727401 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.727414 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.749234 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.749440 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:51:27.749416272 +0000 UTC m=+26.011303901 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.749521 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.749563 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.749693 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.749746 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:27.749734956 +0000 UTC m=+26.011622585 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.750332 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.750519 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:27.750485835 +0000 UTC m=+26.012373464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.754689 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.796651 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.830266 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.830352 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.830376 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.830384 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.830397 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.830405 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.850781 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.850819 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.850926 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.850950 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.850965 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.851010 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:27.85099467 +0000 UTC m=+26.112882309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.850911 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.851029 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.851038 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:23 crc kubenswrapper[4789]: E1216 06:51:23.851064 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:27.85105648 +0000 UTC m=+26.112944109 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.875870 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.910342 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:23Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.933548 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.933587 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.933605 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 
06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.933622 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:23 crc kubenswrapper[4789]: I1216 06:51:23.933634 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:23Z","lastTransitionTime":"2025-12-16T06:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.036240 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.036276 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.036287 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.036303 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.036314 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.103882 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.103955 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:24 crc kubenswrapper[4789]: E1216 06:51:24.104032 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.104067 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:24 crc kubenswrapper[4789]: E1216 06:51:24.104108 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:24 crc kubenswrapper[4789]: E1216 06:51:24.104533 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.138394 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.138435 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.138444 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.138460 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.138471 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.240819 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.240867 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.240883 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.240902 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.240933 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.266867 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.266936 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.268177 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wjmvq" event={"ID":"902255f3-ae7f-4bce-bf64-b50fe8753a2b","Type":"ContainerStarted","Data":"b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.268202 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wjmvq" event={"ID":"902255f3-ae7f-4bce-bf64-b50fe8753a2b","Type":"ContainerStarted","Data":"ee044a0e6505f8bb82bb1da7725018be56bbf3e8b03a2bbf224d1ae20c3c7c43"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.270396 4789 generic.go:334] "Generic (PLEG): container finished" podID="529ecdde-d194-4bf4-9e89-4accd6630349" containerID="aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a" exitCode=0 Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.270633 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerDied","Data":"aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.287872 4789 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.302325 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.318490 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.334265 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.344070 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc 
kubenswrapper[4789]: I1216 06:51:24.344116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.344129 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.344147 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.344170 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.351434 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.366360 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.379148 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.404576 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.425461 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.438312 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.447343 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.447374 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.447382 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.447395 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.447405 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.502231 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.522324 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.537870 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.550154 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.550215 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.550231 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.550255 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc 
kubenswrapper[4789]: I1216 06:51:24.550269 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.552427 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.565810 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428
c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.576759 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.596338 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.631483 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.653085 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.653139 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.653150 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 
06:51:24.653167 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.653182 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.674077 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.709714 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.754174 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.756709 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.756755 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.756768 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.756787 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.756800 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.792522 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.833269 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.859296 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.859325 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.859333 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.859348 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.859357 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.872040 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.914460 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.952093 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.962396 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.962448 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.962466 4789 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.962488 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.962505 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:24Z","lastTransitionTime":"2025-12-16T06:51:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:24 crc kubenswrapper[4789]: I1216 06:51:24.994026 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:24Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.030130 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.065306 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.065358 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.065372 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.065391 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.065403 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.076798 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.117627 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.167806 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.167848 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.167859 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 
06:51:25.167874 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.167887 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.270540 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.270589 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.270601 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.270618 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.270630 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.275590 4789 generic.go:334] "Generic (PLEG): container finished" podID="529ecdde-d194-4bf4-9e89-4accd6630349" containerID="7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66" exitCode=0 Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.275645 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerDied","Data":"7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.293862 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.311292 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.325469 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.348544 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.362781 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.372912 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.372976 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.372989 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 
06:51:25.373007 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.373019 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.382163 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.402092 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe75125
92f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.431340 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.473336 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.475371 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.475419 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.475435 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.475450 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.475460 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.515860 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.552594 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.577656 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.577699 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.577708 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.577724 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.577734 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.589991 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.639600 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.673544 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.680691 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.680763 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.680772 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 
06:51:25.680788 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.680800 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.714655 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.782887 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.782930 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.782938 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc 
kubenswrapper[4789]: I1216 06:51:25.782954 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.782964 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.886692 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.886749 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.886762 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.886781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.886803 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.990087 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.990152 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.990169 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.990200 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:25 crc kubenswrapper[4789]: I1216 06:51:25.990218 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:25Z","lastTransitionTime":"2025-12-16T06:51:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.093878 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.093956 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.093976 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.094001 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.094018 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.104721 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.104721 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:26 crc kubenswrapper[4789]: E1216 06:51:26.104871 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.104725 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:26 crc kubenswrapper[4789]: E1216 06:51:26.104987 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:26 crc kubenswrapper[4789]: E1216 06:51:26.105156 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.197140 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.197219 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.197237 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.197266 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.197286 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.285527 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.288603 4789 generic.go:334] "Generic (PLEG): container finished" podID="529ecdde-d194-4bf4-9e89-4accd6630349" containerID="642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70" exitCode=0 Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.288645 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerDied","Data":"642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.299471 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.299538 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.299553 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.299575 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.299589 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.314087 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.331317 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.349899 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.370097 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.384053 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.397471 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.402445 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.402484 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.402493 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.402508 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.402527 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.415365 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.427208 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.439284 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.450154 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.461763 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.475437 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.485531 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.496773 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.504041 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc 
kubenswrapper[4789]: I1216 06:51:26.504083 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.504094 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.504109 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.504121 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.511204 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.606108 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.606152 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.606160 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.606176 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.606185 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.708853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.708956 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.708974 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.708999 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.709016 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.811362 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.811868 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.811889 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.811964 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.811984 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.914900 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.915006 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.915032 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.915061 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:26 crc kubenswrapper[4789]: I1216 06:51:26.915083 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:26Z","lastTransitionTime":"2025-12-16T06:51:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.018696 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.018732 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.018742 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.018759 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.018770 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.121028 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.121079 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.121093 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.121113 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.121128 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.224247 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.224278 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.224288 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.224304 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.224329 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.294584 4789 generic.go:334] "Generic (PLEG): container finished" podID="529ecdde-d194-4bf4-9e89-4accd6630349" containerID="2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb" exitCode=0 Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.294649 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerDied","Data":"2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.310846 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.330773 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.331588 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.331623 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.331637 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.331659 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.331670 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.345181 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.362008 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.374321 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.388358 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.404416 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.421938 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.435015 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.435049 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.435059 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.435074 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.435085 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.435310 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.457203 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.472449 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.489624 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.502491 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.519359 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.530809 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.537329 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.537363 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.537372 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 
06:51:27.537387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.537396 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.639489 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.639526 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.639534 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.639549 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.639559 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.742420 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.742456 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.742467 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.742481 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.742491 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.812000 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.812089 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.812159 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.812205 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:51:35.812185075 +0000 UTC m=+34.074072704 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.812242 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.812278 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.812299 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:35.812286166 +0000 UTC m=+34.074173815 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.812420 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 06:51:35.812400047 +0000 UTC m=+34.074287696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.844474 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.844509 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.844520 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.844532 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.844542 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.912423 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.912479 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.912594 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.912590 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.912636 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.912648 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 
06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.912697 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:35.912680819 +0000 UTC m=+34.174568448 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.912612 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.913091 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:27 crc kubenswrapper[4789]: E1216 06:51:27.913172 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:35.913153854 +0000 UTC m=+34.175041483 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.946630 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.946665 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.946674 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.946690 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:27 crc kubenswrapper[4789]: I1216 06:51:27.946701 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:27Z","lastTransitionTime":"2025-12-16T06:51:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.051083 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.051139 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.051163 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.051189 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.051206 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.104672 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.104747 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.104702 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:28 crc kubenswrapper[4789]: E1216 06:51:28.104831 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:28 crc kubenswrapper[4789]: E1216 06:51:28.104980 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:28 crc kubenswrapper[4789]: E1216 06:51:28.105055 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.153885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.153933 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.153941 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.153953 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.153962 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.256804 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.256837 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.256864 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.256879 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.256888 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.308878 4789 generic.go:334] "Generic (PLEG): container finished" podID="529ecdde-d194-4bf4-9e89-4accd6630349" containerID="4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552" exitCode=0 Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.308962 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerDied","Data":"4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.315186 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.315627 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.315656 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.344306 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.345562 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.351371 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.361546 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.361600 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.361615 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.361799 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.361820 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.372229 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.383902 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.406451 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.416397 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.429368 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.441668 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.453957 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.464074 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.464133 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.464151 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.464174 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.464191 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.470338 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.481654 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.493215 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.510244 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.527904 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.543183 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.554295 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.564392 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.566233 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.566271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.566282 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.566300 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.566312 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.574626 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.592384 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.603072 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.614741 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.627058 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04
bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.636568 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.655363 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.664986 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.668028 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.668056 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.668064 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc 
kubenswrapper[4789]: I1216 06:51:28.668078 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.668087 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.676758 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.689691 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.700095 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.710387 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.720053 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.735967 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:28Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.769656 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.769691 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.769699 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.769712 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.769722 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.872371 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.872433 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.872443 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.872458 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.872467 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.975260 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.975306 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.975317 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.975334 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:28 crc kubenswrapper[4789]: I1216 06:51:28.975346 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:28Z","lastTransitionTime":"2025-12-16T06:51:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.078557 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.078630 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.078654 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.078684 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.078705 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.181734 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.181797 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.181815 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.181845 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.181868 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.285468 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.285568 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.285592 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.285622 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.285639 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.330199 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" event={"ID":"529ecdde-d194-4bf4-9e89-4accd6630349","Type":"ContainerStarted","Data":"3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.330342 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.358560 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.374112 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.388720 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.388756 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.388766 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.388904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.388938 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.389991 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.460432 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.475429 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.488467 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.491048 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.491077 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.491085 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.491097 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.491107 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.520144 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.536460 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.560785 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.582056 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.593136 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.593179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.593191 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.593209 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.593221 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.598448 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e2
5b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.616161 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.633685 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.644295 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.652970 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:29Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.695421 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.695460 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.695471 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc 
kubenswrapper[4789]: I1216 06:51:29.695487 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.695498 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.797941 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.797980 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.797992 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.798008 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.798020 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.900114 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.900154 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.900164 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.900183 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:29 crc kubenswrapper[4789]: I1216 06:51:29.900193 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:29Z","lastTransitionTime":"2025-12-16T06:51:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.005853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.005967 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.005994 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.006040 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.006081 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.104432 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.104449 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.104550 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.104643 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.104888 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.104967 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.109513 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.109534 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.109543 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.109556 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.109566 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.211622 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.211666 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.211680 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.211698 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.211709 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.315434 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.315485 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.315505 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.315539 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.315560 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.334200 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.418950 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.419061 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.419092 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.419121 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.419169 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.523406 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.523471 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.523486 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.523507 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.523521 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.626592 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.626634 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.626664 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.626679 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.626688 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.731249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.731337 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.731357 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.731386 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.731408 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.814108 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.814190 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.814202 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.814221 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.814296 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.838552 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.844701 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.844761 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.844781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.844806 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.844824 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.892455 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.898052 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.898121 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.898146 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.898175 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.898194 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.918560 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.924045 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.924098 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.924110 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.924129 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.924146 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.945718 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.952295 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.952337 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.952351 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.952372 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.952387 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.974883 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:30Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:30 crc kubenswrapper[4789]: E1216 06:51:30.975178 4789 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.978387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.978440 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.978453 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.978718 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:30 crc kubenswrapper[4789]: I1216 06:51:30.978737 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:30Z","lastTransitionTime":"2025-12-16T06:51:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.082711 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.083580 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.083802 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.084009 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.084184 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.187780 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.187885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.187943 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.188032 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.188073 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.290307 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.290342 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.290353 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.290369 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.290381 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.338193 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/0.log" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.340588 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0" exitCode=1 Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.340622 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.341160 4789 scope.go:117] "RemoveContainer" containerID="1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.373215 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.386745 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.391851 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.391889 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.391905 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 
06:51:31.391949 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.391966 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.403211 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.428378 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.440776 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.457225 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.469263 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.478023 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.488548 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.494109 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.494165 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.494177 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.494204 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.494217 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.503783 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.518493 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.536395 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.552369 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.564795 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.578321 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:31Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.597410 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.597476 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.597487 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.597500 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.597510 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.701038 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.704972 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.705013 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.705032 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.705050 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.807017 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.807053 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.807061 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.807074 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.807100 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.909949 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.910015 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.910026 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.910042 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:31 crc kubenswrapper[4789]: I1216 06:51:31.910072 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:31Z","lastTransitionTime":"2025-12-16T06:51:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.012806 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.012874 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.012890 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.012939 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.012961 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.104718 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:32 crc kubenswrapper[4789]: E1216 06:51:32.104853 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.104987 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:32 crc kubenswrapper[4789]: E1216 06:51:32.105116 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.107996 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:32 crc kubenswrapper[4789]: E1216 06:51:32.108080 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.116689 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.116739 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.116754 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.116776 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.116793 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.118773 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.130535 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.149273 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.161970 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.176365 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.208766 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.221237 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.221308 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.221339 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.221374 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.221397 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.229228 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.246841 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.265241 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04
bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.280052 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.296790 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.312526 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.325506 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.325565 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.325579 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc 
kubenswrapper[4789]: I1216 06:51:32.325600 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.325613 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.332388 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.346887 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/0.log" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.351649 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.351889 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.353159 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64201
2f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.371009 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.397963 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.418369 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.430227 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.430320 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.430337 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 
06:51:32.430367 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.430388 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.438998 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.458885 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.477727 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.496665 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.524617 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.533547 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.533621 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.533641 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.533701 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.533725 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.540461 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.568624 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64201
2f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.587639 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.607961 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.625831 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.637116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.637173 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.637190 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.637210 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.637222 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.639399 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.653640 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.666129 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.739580 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc 
kubenswrapper[4789]: I1216 06:51:32.739632 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.739644 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.739662 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.739675 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.842418 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.842473 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.842483 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.842499 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.842509 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.944995 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.945037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.945049 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.945065 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.945077 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:32Z","lastTransitionTime":"2025-12-16T06:51:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.988271 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m"] Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.988833 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.992151 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 06:51:32 crc kubenswrapper[4789]: I1216 06:51:32.992432 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.015412 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.035078 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.047811 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.047849 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.047860 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.047875 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.047888 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.052819 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.063472 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlmmz\" (UniqueName: \"kubernetes.io/projected/b6cbf639-2df9-4d83-965e-148cb7787b12-kube-api-access-mlmmz\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.063552 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6cbf639-2df9-4d83-965e-148cb7787b12-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.063607 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6cbf639-2df9-4d83-965e-148cb7787b12-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.063644 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6cbf639-2df9-4d83-965e-148cb7787b12-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.068317 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.086495 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.098673 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.118853 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.132628 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.145585 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.150030 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.150056 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.150064 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.150077 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.150087 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.160653 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.164567 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlmmz\" (UniqueName: \"kubernetes.io/projected/b6cbf639-2df9-4d83-965e-148cb7787b12-kube-api-access-mlmmz\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.164645 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6cbf639-2df9-4d83-965e-148cb7787b12-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.164684 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6cbf639-2df9-4d83-965e-148cb7787b12-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.164714 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6cbf639-2df9-4d83-965e-148cb7787b12-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.165556 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6cbf639-2df9-4d83-965e-148cb7787b12-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.165563 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6cbf639-2df9-4d83-965e-148cb7787b12-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.175476 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6cbf639-2df9-4d83-965e-148cb7787b12-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.181412 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlmmz\" (UniqueName: \"kubernetes.io/projected/b6cbf639-2df9-4d83-965e-148cb7787b12-kube-api-access-mlmmz\") pod \"ovnkube-control-plane-749d76644c-x8b7m\" (UID: \"b6cbf639-2df9-4d83-965e-148cb7787b12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 
06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.182156 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86
c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287f
aaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.196570 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.207580 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.217788 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.227460 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.237284 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.252743 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.252811 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.252822 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.252836 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.252848 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.301018 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.355064 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.355095 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.355104 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.355116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.355125 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.360404 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/1.log" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.360977 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/0.log" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.363508 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22" exitCode=1 Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.363558 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.363600 4789 scope.go:117] "RemoveContainer" containerID="1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.364686 4789 scope.go:117] "RemoveContainer" containerID="66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22" Dec 16 06:51:33 crc kubenswrapper[4789]: E1216 06:51:33.364825 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.365606 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" event={"ID":"b6cbf639-2df9-4d83-965e-148cb7787b12","Type":"ContainerStarted","Data":"f27dfb8bbd3eb245202cd929a635f9486928995035edff5c1888e142db79b033"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.375702 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.399124 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6
a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.411146 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.424221 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.433651 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.447150 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.457365 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.457667 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.457691 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.457701 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.457716 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.457725 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.465950 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.492559 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.505356 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.516883 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.531068 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.544969 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.555800 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.559277 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.559315 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.559327 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 
06:51:33.559344 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.559355 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.567448 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.579775 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:33Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.661954 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.662007 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.662018 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.662037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.662049 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.763775 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.763804 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.763811 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.763824 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.763833 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.865984 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.866013 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.866023 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.866037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.866052 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.970248 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.970294 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.970317 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.970344 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:33 crc kubenswrapper[4789]: I1216 06:51:33.970356 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:33Z","lastTransitionTime":"2025-12-16T06:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.072999 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.073036 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.073044 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.073058 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.073068 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.104538 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.104547 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:34 crc kubenswrapper[4789]: E1216 06:51:34.104718 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.104547 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:34 crc kubenswrapper[4789]: E1216 06:51:34.104812 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:34 crc kubenswrapper[4789]: E1216 06:51:34.104829 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.176417 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.176451 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.176462 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.176477 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.176489 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.278879 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.278942 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.279074 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.279099 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.279113 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.369992 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/1.log" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.374467 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" event={"ID":"b6cbf639-2df9-4d83-965e-148cb7787b12","Type":"ContainerStarted","Data":"0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.374522 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" event={"ID":"b6cbf639-2df9-4d83-965e-148cb7787b12","Type":"ContainerStarted","Data":"1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.380760 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.380833 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.380853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.380878 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.380900 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.397434 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.411625 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.430987 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.451980 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.466022 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f79
9f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.476486 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.483011 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.483051 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.483061 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.483075 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.483086 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.495136 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.506598 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.519903 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd
97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a
8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.531274 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.542651 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.554211 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.565071 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.575530 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.585452 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.585491 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.585500 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc 
kubenswrapper[4789]: I1216 06:51:34.585514 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.585524 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.588148 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.598792 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:
32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.690721 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.690905 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.690964 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.690997 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.691031 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.795404 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.795454 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.795466 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.795485 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.795498 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.813504 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ttcm5"] Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.814097 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:34 crc kubenswrapper[4789]: E1216 06:51:34.814176 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.827346 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.838331 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.873680 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.879510 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.879678 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c7br\" (UniqueName: \"kubernetes.io/projected/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-kube-api-access-8c7br\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.897735 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.897769 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.897777 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.897790 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.897799 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:34Z","lastTransitionTime":"2025-12-16T06:51:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.907002 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.922541 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.942233 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.955644 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.967252 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc 
kubenswrapper[4789]: I1216 06:51:34.980950 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c7br\" (UniqueName: \"kubernetes.io/projected/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-kube-api-access-8c7br\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.981056 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:34 crc kubenswrapper[4789]: E1216 06:51:34.981231 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:34 crc kubenswrapper[4789]: E1216 06:51:34.981305 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:35.481284106 +0000 UTC m=+33.743171745 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.983193 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.998396 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:34Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:34 crc kubenswrapper[4789]: I1216 06:51:34.998944 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c7br\" (UniqueName: \"kubernetes.io/projected/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-kube-api-access-8c7br\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.000167 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.000194 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.000203 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.000219 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.000229 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.013652 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.036721 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.050770 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.060136 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.075978 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.086686 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.098982 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:35Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.102829 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.102877 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.102892 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.102922 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.102936 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.206120 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.206212 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.206236 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.206271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.206300 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.310147 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.310197 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.310206 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.310225 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.310238 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.413174 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.413217 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.413228 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.413244 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.413256 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.487686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.488421 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.488473 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:36.488458935 +0000 UTC m=+34.750346554 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.516321 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.516372 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.516385 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.516407 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.516419 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.619062 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.619137 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.619147 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.619183 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.619196 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.721542 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.721841 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.721855 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.721885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.721898 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.824523 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.824566 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.824577 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.824595 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.824607 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.893375 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.893469 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.893538 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.893644 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.893708 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:51.893692475 +0000 UTC m=+50.155580104 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.893762 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.893850 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:51.893829229 +0000 UTC m=+50.155716868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.893987 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:51:51.893858119 +0000 UTC m=+50.155745788 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.926340 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.926384 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.926394 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.926410 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.926422 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:35Z","lastTransitionTime":"2025-12-16T06:51:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.994710 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:35 crc kubenswrapper[4789]: I1216 06:51:35.994772 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.994963 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.994988 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.995003 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.995071 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:51.995049499 +0000 UTC m=+50.256937138 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.994964 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.995107 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.995119 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:35 crc kubenswrapper[4789]: E1216 06:51:35.995154 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:51.995142461 +0000 UTC m=+50.257030100 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.029620 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.029666 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.029675 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.029694 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.029705 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.104399 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:36 crc kubenswrapper[4789]: E1216 06:51:36.104519 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.104744 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:36 crc kubenswrapper[4789]: E1216 06:51:36.104817 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.104868 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:36 crc kubenswrapper[4789]: E1216 06:51:36.104928 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.131733 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.131758 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.131770 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.131785 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.131796 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.235179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.235260 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.235283 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.235317 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.235342 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.338291 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.338378 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.338391 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.338415 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.338433 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.440496 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.440546 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.440556 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.440572 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.440582 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.500540 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:36 crc kubenswrapper[4789]: E1216 06:51:36.500699 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:36 crc kubenswrapper[4789]: E1216 06:51:36.500790 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:38.500771771 +0000 UTC m=+36.762659400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.548380 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.548419 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.548429 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.548444 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.548454 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.650995 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.651035 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.651043 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.651056 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.651065 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.754166 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.754212 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.754223 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.754241 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.754255 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.856491 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.856533 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.856545 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.856561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.856574 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.959516 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.959580 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.959598 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.959623 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:36 crc kubenswrapper[4789]: I1216 06:51:36.959640 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:36Z","lastTransitionTime":"2025-12-16T06:51:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.062845 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.062882 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.062891 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.062905 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.062928 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.104441 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:37 crc kubenswrapper[4789]: E1216 06:51:37.104631 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.164902 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.165025 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.165049 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.165077 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.165098 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.268002 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.268075 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.268095 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.268120 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.268145 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.371504 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.371567 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.371584 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.371614 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.371633 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.474620 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.474722 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.474746 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.474776 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.474797 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.578191 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.578265 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.578290 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.578319 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.578336 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.681496 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.681571 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.681600 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.681632 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.681657 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.785239 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.785293 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.785354 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.785386 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.785409 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.887953 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.887997 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.888006 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.888020 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.888031 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.990385 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.990421 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.990433 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.990451 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:37 crc kubenswrapper[4789]: I1216 06:51:37.990463 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:37Z","lastTransitionTime":"2025-12-16T06:51:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.093282 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.093332 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.093347 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.093367 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.093380 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.104654 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:38 crc kubenswrapper[4789]: E1216 06:51:38.104783 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.105142 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.105238 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:38 crc kubenswrapper[4789]: E1216 06:51:38.105529 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:38 crc kubenswrapper[4789]: E1216 06:51:38.105421 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.197206 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.197325 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.197351 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.197384 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.197422 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.300858 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.300980 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.301009 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.301038 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.301061 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.403610 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.403669 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.403689 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.403715 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.403732 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.506904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.507021 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.507042 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.507066 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.507083 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.520710 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:38 crc kubenswrapper[4789]: E1216 06:51:38.520999 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:38 crc kubenswrapper[4789]: E1216 06:51:38.521114 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:42.521086873 +0000 UTC m=+40.782974532 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.544047 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.569203 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.592703 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.609689 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.609758 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.609781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.609811 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.609833 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.612389 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.623887 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.641401 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.666075 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.686413 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.701439 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc 
kubenswrapper[4789]: I1216 06:51:38.712804 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.712856 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.712872 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.712893 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.712941 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.714839 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.726496 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.743002 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.768304 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.782829 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.794141 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.814998 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.815037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.815048 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.815065 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.815076 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.816069 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1179f93c1a6fa73cbdc0cdcd5d1d43b816f9365ebead995358f3e433d9c753b0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:30Z\\\",\\\"message\\\":\\\".475602 6068 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.475731 6068 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:30.476125 6068 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 06:51:30.476237 6068 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 06:51:30.476272 6068 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 06:51:30.476317 6068 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:30.476327 6068 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 06:51:30.476378 6068 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 06:51:30.476414 6068 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 06:51:30.476490 6068 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 06:51:30.476542 6068 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 06:51:30.476614 6068 factory.go:656] Stopping watch factory\\\\nI1216 06:51:30.476655 6068 ovnkube.go:599] Stopped ovnkube\\\\nI1216 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"nam
e\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.829710 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.844637 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:38Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.917413 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.917472 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.917522 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.917550 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:38 crc kubenswrapper[4789]: I1216 06:51:38.917570 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:38Z","lastTransitionTime":"2025-12-16T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.020349 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.020390 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.020401 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.020416 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.020427 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.104610 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:39 crc kubenswrapper[4789]: E1216 06:51:39.104859 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.122690 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.122738 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.122752 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.122770 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.122783 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.225471 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.225507 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.225521 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.225538 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.225553 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.328590 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.328636 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.328647 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.328663 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.328674 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.431409 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.431469 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.431535 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.431572 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.431585 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.533806 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.533871 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.533883 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.533900 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.533932 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.637782 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.637853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.637875 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.637904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.637961 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.740206 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.740258 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.740271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.740289 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.740302 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.842030 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.842086 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.842098 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.842117 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.842131 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.945328 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.945387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.945415 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.945437 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:39 crc kubenswrapper[4789]: I1216 06:51:39.945449 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:39Z","lastTransitionTime":"2025-12-16T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.048511 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.048584 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.048606 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.048634 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.048653 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.075760 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.077304 4789 scope.go:117] "RemoveContainer" containerID="66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22" Dec 16 06:51:40 crc kubenswrapper[4789]: E1216 06:51:40.077629 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.091877 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.103514 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.104643 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:40 crc kubenswrapper[4789]: E1216 06:51:40.104774 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.104842 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.104867 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:40 crc kubenswrapper[4789]: E1216 06:51:40.105408 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:40 crc kubenswrapper[4789]: E1216 06:51:40.105520 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.119120 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.131208 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.148880 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.151537 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.151566 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.151575 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.151591 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.151601 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.163559 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e2
5b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.181233 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.192745 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc 
kubenswrapper[4789]: I1216 06:51:40.207263 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.231281 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.244981 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.253316 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.253341 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.253353 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 
06:51:40.253368 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.253379 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.257874 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.282051 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.292370 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.306488 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.317082 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.326016 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:40Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.355639 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.355681 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.355695 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.355715 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.355731 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.458250 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.458316 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.458339 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.458366 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.458388 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.560714 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.560757 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.560772 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.560792 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.560809 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.663460 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.663511 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.663526 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.663546 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.663563 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.766227 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.766268 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.766277 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.766292 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.766319 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.868460 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.868514 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.868527 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.868550 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.868565 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.971295 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.971342 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.971356 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.971375 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:40 crc kubenswrapper[4789]: I1216 06:51:40.971389 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:40Z","lastTransitionTime":"2025-12-16T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.075112 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.075204 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.075230 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.075264 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.075288 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.093355 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.093412 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.093429 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.093455 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.093474 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.104495 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:41 crc kubenswrapper[4789]: E1216 06:51:41.104664 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:41 crc kubenswrapper[4789]: E1216 06:51:41.112976 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.119013 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.119078 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.119097 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.119123 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.119140 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: E1216 06:51:41.141833 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.146628 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.147457 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.147495 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.147516 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.147528 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: E1216 06:51:41.168065 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.171959 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.172111 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.172229 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.172356 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.172478 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: E1216 06:51:41.187165 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.190876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.191057 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.191177 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.191303 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.191434 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: E1216 06:51:41.204759 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:41Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:41 crc kubenswrapper[4789]: E1216 06:51:41.204869 4789 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.206464 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.206492 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.206504 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.206519 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.206530 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.308499 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.308534 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.308542 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.308556 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.308565 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.410685 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.410730 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.410744 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.410760 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.410772 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.512898 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.512943 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.512954 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.512966 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.512975 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.614877 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.614949 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.614966 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.614984 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.614997 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.716751 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.716781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.716790 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.716803 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.716811 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.819181 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.819227 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.819238 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.819256 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.819267 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.921754 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.921835 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.921850 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.921865 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:41 crc kubenswrapper[4789]: I1216 06:51:41.921875 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:41Z","lastTransitionTime":"2025-12-16T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.024668 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.024730 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.024740 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.024758 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.024770 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.104797 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.105054 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:42 crc kubenswrapper[4789]: E1216 06:51:42.105106 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:42 crc kubenswrapper[4789]: E1216 06:51:42.105306 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.105455 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:42 crc kubenswrapper[4789]: E1216 06:51:42.105650 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.128183 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 
06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c
2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.129015 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.129049 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.129059 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.129074 4789 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.129086 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.141225 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T
06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.152478 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.181552 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.195839 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.211425 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.225676 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.231945 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.232004 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.232018 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.232045 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.232060 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.247326 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.267024 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.286586 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.305064 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.322087 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.335462 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.335514 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.335529 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.335550 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.335565 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.335541 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc 
kubenswrapper[4789]: I1216 06:51:42.348118 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.369010 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.381105 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.392815 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:42Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.439869 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.439964 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.439979 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.440009 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.440022 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.543251 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.543305 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.543316 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.543335 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.543347 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.565084 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:42 crc kubenswrapper[4789]: E1216 06:51:42.565296 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:42 crc kubenswrapper[4789]: E1216 06:51:42.565377 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:51:50.5653491 +0000 UTC m=+48.827236759 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.645868 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.645906 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.645928 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.645941 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.645968 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.748159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.748190 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.748198 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.748210 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.748220 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.850090 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.850127 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.850135 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.850151 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.850164 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.953217 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.953300 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.953328 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.953362 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:42 crc kubenswrapper[4789]: I1216 06:51:42.953386 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:42Z","lastTransitionTime":"2025-12-16T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.057295 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.057378 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.057400 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.057426 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.057445 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.105033 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:43 crc kubenswrapper[4789]: E1216 06:51:43.105239 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.161516 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.161598 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.161623 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.161663 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.161688 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.265893 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.266038 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.266048 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.266067 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.266078 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.369819 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.369878 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.369894 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.369936 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.369949 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.472964 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.473041 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.473062 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.473088 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.473106 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.576896 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.576973 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.576988 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.577006 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.577021 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.680580 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.680686 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.680707 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.680737 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.680761 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.782979 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.783037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.783054 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.783086 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.783104 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.886248 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.886300 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.886315 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.886335 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.886349 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.989449 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.989510 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.989523 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.989539 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:43 crc kubenswrapper[4789]: I1216 06:51:43.989551 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:43Z","lastTransitionTime":"2025-12-16T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.092297 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.092341 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.092357 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.092377 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.092391 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.105274 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.105309 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.105398 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:44 crc kubenswrapper[4789]: E1216 06:51:44.105464 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:44 crc kubenswrapper[4789]: E1216 06:51:44.105721 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:44 crc kubenswrapper[4789]: E1216 06:51:44.105643 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.195717 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.195750 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.195763 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.195781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.195794 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.298052 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.298098 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.298111 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.298128 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.298140 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.400219 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.400246 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.400254 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.400267 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.400276 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.502494 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.502532 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.502544 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.502558 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.502568 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.604874 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.604943 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.604963 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.604978 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.604987 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.707359 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.707418 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.707430 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.707446 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.707456 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.810360 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.810395 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.810402 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.810417 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.810443 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.912016 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.912055 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.912067 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.912083 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:44 crc kubenswrapper[4789]: I1216 06:51:44.912096 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:44Z","lastTransitionTime":"2025-12-16T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.014624 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.014662 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.014672 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.014687 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.014698 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.104362 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:45 crc kubenswrapper[4789]: E1216 06:51:45.104479 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.116397 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.116426 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.116434 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.116447 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.116456 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.221318 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.221401 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.221429 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.221484 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.221504 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.323744 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.324100 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.324228 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.324360 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.324474 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.426837 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.426886 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.426896 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.426937 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.426952 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.529469 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.529507 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.529516 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.529531 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.529540 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.633508 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.634086 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.634111 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.634133 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.634146 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.737203 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.737461 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.737637 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.737794 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.738106 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.842482 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.842524 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.842536 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.842555 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.842568 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.944824 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.944858 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.944870 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.944885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:45 crc kubenswrapper[4789]: I1216 06:51:45.944896 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:45Z","lastTransitionTime":"2025-12-16T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.046631 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.046674 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.046704 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.046722 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.046732 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.103956 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:46 crc kubenswrapper[4789]: E1216 06:51:46.104090 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.104111 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:46 crc kubenswrapper[4789]: E1216 06:51:46.104174 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.104213 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:46 crc kubenswrapper[4789]: E1216 06:51:46.104293 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.148879 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.148938 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.148950 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.148966 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.148974 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.250689 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.250740 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.250749 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.250761 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.250771 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.352575 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.352617 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.352629 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.352673 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.352687 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.455711 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.455776 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.455812 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.455842 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.455866 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.559009 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.559066 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.559074 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.559089 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.559099 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.661469 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.661523 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.661539 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.661561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.661578 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.765067 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.765127 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.765139 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.765155 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.765165 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.867859 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.867994 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.868021 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.868052 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.868074 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.972158 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.972228 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.972244 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.972272 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:46 crc kubenswrapper[4789]: I1216 06:51:46.972292 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:46Z","lastTransitionTime":"2025-12-16T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.076360 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.076423 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.076442 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.076471 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.076494 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.103994 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:47 crc kubenswrapper[4789]: E1216 06:51:47.104196 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.179162 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.179211 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.179224 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.179244 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.179262 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.282676 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.282728 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.282740 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.282759 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.282771 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.385714 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.385765 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.385779 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.385802 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.385820 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.488560 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.488598 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.488610 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.488625 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.488636 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.596505 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.596568 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.596582 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.596619 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.596634 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.698840 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.698868 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.698876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.698890 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.698900 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.800993 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.801047 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.801060 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.801079 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.801091 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.903978 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.904035 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.904051 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.904069 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:47 crc kubenswrapper[4789]: I1216 06:51:47.904085 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:47Z","lastTransitionTime":"2025-12-16T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.006971 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.007013 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.007024 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.007043 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.007059 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.104817 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.104903 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:48 crc kubenswrapper[4789]: E1216 06:51:48.104995 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.105056 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:48 crc kubenswrapper[4789]: E1216 06:51:48.105199 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:48 crc kubenswrapper[4789]: E1216 06:51:48.105342 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.110754 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.110815 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.110828 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.110857 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.110873 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.213999 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.214055 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.214068 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.214093 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.214108 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.317306 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.317372 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.317389 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.317422 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.317462 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.420556 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.420622 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.420661 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.420691 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.420713 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.523981 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.524016 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.524029 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.524048 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.524060 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.625943 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.625993 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.626007 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.626029 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.626043 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.728796 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.728846 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.728858 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.728874 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.728886 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.831179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.831217 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.831227 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.831285 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.831301 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.934196 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.934258 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.934273 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.934297 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:48 crc kubenswrapper[4789]: I1216 06:51:48.934311 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:48Z","lastTransitionTime":"2025-12-16T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.037708 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.037782 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.037805 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.037834 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.037856 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.103959 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:49 crc kubenswrapper[4789]: E1216 06:51:49.104218 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.140774 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.140825 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.140833 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.140850 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.140860 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.244038 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.244127 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.244154 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.244190 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.244216 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.347280 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.347317 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.347325 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.347339 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.347352 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.449328 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.449366 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.449376 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.449392 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.449404 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.552146 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.552190 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.552207 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.552228 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.552246 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.655051 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.655127 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.655152 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.655182 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.655204 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.759435 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.759514 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.759537 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.759568 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.759601 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.862723 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.862783 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.862800 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.862827 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.862845 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.965700 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.965764 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.965773 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.965787 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:49 crc kubenswrapper[4789]: I1216 06:51:49.965798 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:49Z","lastTransitionTime":"2025-12-16T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.068817 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.069045 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.069282 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.069539 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.069728 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.104367 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:50 crc kubenswrapper[4789]: E1216 06:51:50.104529 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.104802 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.104965 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:50 crc kubenswrapper[4789]: E1216 06:51:50.105271 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:50 crc kubenswrapper[4789]: E1216 06:51:50.105412 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.172749 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.172815 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.172838 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.172864 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.172886 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.276775 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.276801 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.276809 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.276822 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.276831 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.380953 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.380990 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.381002 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.381018 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.381028 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.483848 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.484297 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.484458 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.484600 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.484719 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.587617 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.587658 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.587668 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.587682 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.587693 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.650669 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:50 crc kubenswrapper[4789]: E1216 06:51:50.650867 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:50 crc kubenswrapper[4789]: E1216 06:51:50.650990 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:52:06.650964529 +0000 UTC m=+64.912852198 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.690572 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.690631 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.690648 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.690671 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.690687 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.794951 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.795048 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.795073 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.795103 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.795125 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.898303 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.898381 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.898399 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.898449 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:50 crc kubenswrapper[4789]: I1216 06:51:50.898472 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:50Z","lastTransitionTime":"2025-12-16T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.001692 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.001790 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.001811 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.001837 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.001855 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.104810 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.104971 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.105603 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.105677 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.105706 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.105737 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.105761 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.209616 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.209708 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.209870 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.210002 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.210020 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.287528 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.287583 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.287596 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.287614 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.287627 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.303350 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.308352 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.308415 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.308434 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.308459 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.308477 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.323454 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.328179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.328293 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.328307 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.328324 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.328337 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.340150 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.351451 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.351540 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.351561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.351590 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.351609 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.365595 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.370216 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.370273 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.370288 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.370314 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.370332 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.382191 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:51Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.382352 4789 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.383970 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.383999 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.384008 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.384022 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.384034 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.486295 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.486376 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.486390 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.486408 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.486420 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.590287 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.590330 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.590342 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.590357 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.590367 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.693409 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.693449 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.693463 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.693482 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.693496 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.795829 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.795868 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.795878 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.795892 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.795926 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.898388 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.898458 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.898475 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.898495 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.898509 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:51Z","lastTransitionTime":"2025-12-16T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.968987 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.969202 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.969274 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:52:23.969233383 +0000 UTC m=+82.231121052 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.969326 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:51 crc kubenswrapper[4789]: I1216 06:51:51.969337 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.969401 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:52:23.969378527 +0000 UTC m=+82.231266196 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.969501 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:51 crc kubenswrapper[4789]: E1216 06:51:51.969592 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:52:23.969570941 +0000 UTC m=+82.231458660 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.000386 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.000430 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.000441 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.000459 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.000473 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.070770 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.070826 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.070980 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.071001 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.071013 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.071070 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 06:52:24.071053189 +0000 UTC m=+82.332940818 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.071082 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.071130 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.071147 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.071226 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:52:24.071201363 +0000 UTC m=+82.333089032 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.103128 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.103188 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.103207 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.103229 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.103245 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.104250 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.104256 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.104264 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.104460 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.104535 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:52 crc kubenswrapper[4789]: E1216 06:51:52.104354 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.114126 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.134938 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.145973 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.170452 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.187014 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.205991 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.206081 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.206094 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.206112 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.206124 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.209652 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.228879 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 
2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.240742 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.253524 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.267906 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.282417 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.298900 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.309603 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.309660 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.309674 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 
06:51:52.309692 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.309713 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.315529 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc 
kubenswrapper[4789]: I1216 06:51:52.328055 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.343767 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.362693 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.375417 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:52Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.412363 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.412426 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.412440 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 
06:51:52.412462 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.412477 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.515131 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.515165 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.515178 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.515193 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.515206 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.617185 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.617249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.617268 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.617294 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.617312 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.719632 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.719698 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.719713 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.719731 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.719742 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.822744 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.822775 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.822783 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.822796 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.822806 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.926328 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.926387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.926405 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.926428 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:52 crc kubenswrapper[4789]: I1216 06:51:52.926445 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:52Z","lastTransitionTime":"2025-12-16T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.029016 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.029082 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.029099 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.029126 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.029143 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.103954 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:53 crc kubenswrapper[4789]: E1216 06:51:53.104192 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.132381 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.132426 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.132439 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.132456 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.132468 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.234766 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.234813 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.234824 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.234843 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.234854 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.338068 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.338122 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.338134 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.338151 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.338168 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.441408 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.441462 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.441477 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.441500 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.441516 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.544084 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.544270 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.544290 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.544314 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.544333 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.646953 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.647028 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.647053 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.647091 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.647112 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.749959 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.750037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.750069 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.750103 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.750127 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.853159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.853204 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.853221 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.853240 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.853254 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.963261 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.963340 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.963382 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.963415 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:53 crc kubenswrapper[4789]: I1216 06:51:53.963441 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:53Z","lastTransitionTime":"2025-12-16T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.065852 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.065883 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.065891 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.065904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.065942 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.104538 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.104590 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.104619 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:54 crc kubenswrapper[4789]: E1216 06:51:54.104731 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:54 crc kubenswrapper[4789]: E1216 06:51:54.104959 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:54 crc kubenswrapper[4789]: E1216 06:51:54.105152 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.168745 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.168788 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.168799 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.168815 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.168826 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.271653 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.271703 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.271714 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.271733 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.271745 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.374614 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.374661 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.374672 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.374688 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.374698 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.477996 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.478067 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.478084 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.478534 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.478599 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.581225 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.581304 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.581314 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.581333 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.581345 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.683392 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.683518 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.683545 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.683572 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.683638 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.726317 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.734867 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.739183 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.750762 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.760571 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.773832 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.785464 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc 
kubenswrapper[4789]: I1216 06:51:54.785503 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.785513 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.785530 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.785479 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eed
b413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8
c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5ab
c66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab
391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Complete
d\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.785541 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.796417 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e2
5b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.806068 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.815453 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc 
kubenswrapper[4789]: I1216 06:51:54.825896 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.843033 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.855686 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.867944 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.890162 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.890213 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.890228 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.890249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.890264 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.890480 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.901411 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.915808 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.927598 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.937701 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:54Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.992179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.992222 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.992232 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.992249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:54 crc kubenswrapper[4789]: I1216 06:51:54.992260 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:54Z","lastTransitionTime":"2025-12-16T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.095354 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.095408 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.095418 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.095436 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.095446 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.104209 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:55 crc kubenswrapper[4789]: E1216 06:51:55.104618 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.105008 4789 scope.go:117] "RemoveContainer" containerID="66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.197712 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.197862 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.197971 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.198062 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.198128 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.300815 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.300849 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.300858 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.300873 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.300882 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.402751 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.402789 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.402800 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.402813 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.402822 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.452285 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/1.log" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.454978 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.474553 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.488773 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.500199 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.504417 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.504444 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.504453 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc 
kubenswrapper[4789]: I1216 06:51:55.504465 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.504474 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.512923 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06
:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.525174 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.537208 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.554678 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.565540 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.575871 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.589490 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bcee
d2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b1578
51fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16
T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.604001 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.607175 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.607217 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.607226 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.607241 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.607254 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.620332 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.634841 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.647362 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.657368 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.666258 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.674244 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc 
kubenswrapper[4789]: I1216 06:51:55.683163 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:55Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.709634 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.709678 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.709688 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.709702 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.709713 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.811952 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.812343 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.812479 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.812578 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.812669 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.916496 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.916555 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.916571 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.916593 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:55 crc kubenswrapper[4789]: I1216 06:51:55.916607 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:55Z","lastTransitionTime":"2025-12-16T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.019587 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.019639 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.019656 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.019680 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.019697 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.104797 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.104847 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:56 crc kubenswrapper[4789]: E1216 06:51:56.105273 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:56 crc kubenswrapper[4789]: E1216 06:51:56.105160 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.104964 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:56 crc kubenswrapper[4789]: E1216 06:51:56.105375 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.124185 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.125006 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.125115 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.125142 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.125189 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.227571 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.227603 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.227613 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.227629 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.227666 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.330344 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.330400 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.330410 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.330423 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.330431 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.433171 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.433208 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.433218 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.433232 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.433244 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.459443 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/2.log" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.459872 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/1.log" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.463223 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071" exitCode=1 Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.463397 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.463616 4789 scope.go:117] "RemoveContainer" containerID="66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.464215 4789 scope.go:117] "RemoveContainer" containerID="87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071" Dec 16 06:51:56 crc kubenswrapper[4789]: E1216 06:51:56.464352 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.479406 4789 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.497750 4789 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:
51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a
1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.513550 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.523617 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.535842 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.536602 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.536637 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.536647 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.536663 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.536674 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.548826 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.560780 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.572322 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.583803 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc 
kubenswrapper[4789]: I1216 06:51:56.597187 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.621716 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.637641 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.640116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.640263 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.640364 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 
06:51:56.640518 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.640633 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.652963 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.671052 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\"
,\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.686109 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.701750 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.729463 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cn
i-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.744344 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.744524 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.744628 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.744578 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:51:56Z is after 2025-08-24T17:21:41Z" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.744718 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.744766 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.846674 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.846731 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.846749 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.846770 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.846786 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.949618 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.949690 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.949710 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.949738 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:56 crc kubenswrapper[4789]: I1216 06:51:56.949757 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:56Z","lastTransitionTime":"2025-12-16T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.052980 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.053453 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.053872 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.054126 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.054376 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.104571 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:57 crc kubenswrapper[4789]: E1216 06:51:57.104725 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.157730 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.157770 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.157781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.157798 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.157811 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.261654 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.261718 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.261736 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.261763 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.261782 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.365136 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.365613 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.365706 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.365795 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.365887 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.468429 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.468473 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.468485 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.468503 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.468516 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.468752 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/2.log" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.571279 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.571320 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.571331 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.571348 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.571360 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.674404 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.674665 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.674748 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.674824 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.674889 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.777325 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.777387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.777404 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.777428 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.777445 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.879986 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.880054 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.880073 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.880098 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.880116 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.983120 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.983160 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.983169 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.983185 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:57 crc kubenswrapper[4789]: I1216 06:51:57.983195 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:57Z","lastTransitionTime":"2025-12-16T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.085367 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.085404 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.085413 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.085425 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.085436 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.104578 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.104614 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:51:58 crc kubenswrapper[4789]: E1216 06:51:58.104678 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:51:58 crc kubenswrapper[4789]: E1216 06:51:58.104802 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.104829 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:51:58 crc kubenswrapper[4789]: E1216 06:51:58.105102 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.187989 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.188050 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.188068 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.188091 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.188111 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.290365 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.290408 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.290416 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.290431 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.290443 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.392946 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.393015 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.393030 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.393048 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.393060 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.496268 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.496723 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.496745 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.496769 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.496787 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.598811 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.598879 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.598903 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.599010 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.599033 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.702674 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.702740 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.702757 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.702781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.702799 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.805842 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.805904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.805955 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.805982 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.806000 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.908780 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.908839 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.908854 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.908876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:58 crc kubenswrapper[4789]: I1216 06:51:58.908890 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:58Z","lastTransitionTime":"2025-12-16T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.012127 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.012196 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.012218 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.012248 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.012276 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.104981 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:51:59 crc kubenswrapper[4789]: E1216 06:51:59.105282 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.115371 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.115434 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.115462 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.115499 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.115519 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.218856 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.218966 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.218992 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.219023 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.219047 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.322046 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.322124 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.322153 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.322187 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.322211 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.424956 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.425012 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.425030 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.425054 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.425075 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.528295 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.528353 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.528371 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.528394 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.528414 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.632283 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.632352 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.632371 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.632397 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.632415 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.736148 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.736216 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.736237 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.736271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.736295 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.840104 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.840181 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.840203 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.840233 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.840256 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.944399 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.944492 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.944520 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.944558 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:51:59 crc kubenswrapper[4789]: I1216 06:51:59.944585 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:51:59Z","lastTransitionTime":"2025-12-16T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.048044 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.048105 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.048127 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.048156 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.048180 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.104428 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.104468 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:00 crc kubenswrapper[4789]: E1216 06:52:00.104664 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.104766 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:00 crc kubenswrapper[4789]: E1216 06:52:00.105010 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:00 crc kubenswrapper[4789]: E1216 06:52:00.105174 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.151719 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.151791 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.151810 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.151837 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.151856 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.256579 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.256670 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.256684 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.256704 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.256717 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.360491 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.360528 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.360540 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.360557 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.360568 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.463564 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.463660 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.463679 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.463717 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.463737 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.567879 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.568005 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.568025 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.568052 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.568073 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.671583 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.671657 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.671681 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.671710 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.671733 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.775471 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.775549 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.775611 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.775646 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.775668 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.879405 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.879475 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.879492 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.879517 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.879531 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.983728 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.983784 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.983800 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.983824 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:00 crc kubenswrapper[4789]: I1216 06:52:00.983839 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:00Z","lastTransitionTime":"2025-12-16T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.087427 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.087515 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.087546 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.087585 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.087612 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.104237 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:01 crc kubenswrapper[4789]: E1216 06:52:01.104437 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.192461 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.192538 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.192551 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.192576 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.192596 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.297057 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.297108 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.297117 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.297138 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.297148 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.401326 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.401464 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.401488 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.401552 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.401577 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.503930 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.504342 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.504453 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.504618 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.504714 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.608621 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.608693 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.608706 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.608726 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.608739 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.650889 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.650965 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.650977 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.650994 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.651028 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: E1216 06:52:01.667447 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.672360 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.672423 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.672440 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.672465 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.672488 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: E1216 06:52:01.690284 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.696731 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.696760 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.696768 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.696784 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.696796 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: E1216 06:52:01.714589 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... node-status patch payload identical to the previous retry at 06:52:01.690284 (conditions, allocatable/capacity, image list, and nodeInfo elided) ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.720171 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.720227 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.720241 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.720261 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.720275 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: E1216 06:52:01.734798 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.739041 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.739101 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.739117 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.739142 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.739162 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: E1216 06:52:01.766493 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:01Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:01 crc kubenswrapper[4789]: E1216 06:52:01.766612 4789 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.769040 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.769179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.769275 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.769376 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.769465 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.872057 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.872115 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.872131 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.872152 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.872169 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.975593 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.975672 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.975699 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.975765 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:01 crc kubenswrapper[4789]: I1216 06:52:01.975785 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:01Z","lastTransitionTime":"2025-12-16T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.089832 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.090861 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.091074 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.091235 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.091384 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.104636 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:02 crc kubenswrapper[4789]: E1216 06:52:02.104745 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.105057 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.105018 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:02 crc kubenswrapper[4789]: E1216 06:52:02.105148 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:02 crc kubenswrapper[4789]: E1216 06:52:02.105398 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.130690 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.150350 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.165688 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.184025 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.197034 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.197075 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.197108 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.197127 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.197191 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.198488 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.209786 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.222661 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.239828 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae
3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:
51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.249384 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc 
kubenswrapper[4789]: I1216 06:52:02.259726 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.280262 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.293820 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.299676 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.299708 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.299730 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 
06:52:02.299747 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.299758 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.304825 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.316647 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\"
,\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.325967 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.337175 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.365872 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cn
i-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.383853 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:02Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.402159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.402187 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.402196 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.402211 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.402220 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.504461 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.504512 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.504522 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.504537 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.504547 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.608223 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.608270 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.608281 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.608300 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.608310 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.710497 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.710599 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.710628 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.710654 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.710673 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.813030 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.813067 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.813075 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.813087 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.813096 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.916773 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.916819 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.916831 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.916850 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:02 crc kubenswrapper[4789]: I1216 06:52:02.916859 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:02Z","lastTransitionTime":"2025-12-16T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.020277 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.020365 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.020390 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.020428 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.020456 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.104679 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:03 crc kubenswrapper[4789]: E1216 06:52:03.105077 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.124253 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.124329 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.124348 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.124377 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.124397 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.227651 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.227692 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.227703 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.227726 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.227740 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.330538 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.330594 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.330610 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.330634 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.330652 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.433123 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.433170 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.433178 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.433194 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.433203 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.536559 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.536632 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.536662 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.536697 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.536723 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.640199 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.640289 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.640308 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.640341 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.640362 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.744016 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.744106 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.744128 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.744157 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.744186 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.848776 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.848836 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.848854 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.848885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.848904 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.952123 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.952196 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.952219 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.952247 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:03 crc kubenswrapper[4789]: I1216 06:52:03.952266 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:03Z","lastTransitionTime":"2025-12-16T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.056010 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.056103 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.056130 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.056160 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.056179 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.104877 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:04 crc kubenswrapper[4789]: E1216 06:52:04.105061 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.105257 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:04 crc kubenswrapper[4789]: E1216 06:52:04.105352 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.105774 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:04 crc kubenswrapper[4789]: E1216 06:52:04.106078 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.159183 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.159221 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.159233 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.159249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.159261 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.262508 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.262554 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.262571 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.262592 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.262609 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.365409 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.365442 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.365449 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.365461 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.365470 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.468113 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.468159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.468174 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.468191 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.468203 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.571893 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.571994 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.572012 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.572039 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.572058 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.675188 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.675223 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.675233 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.675249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.675258 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.778556 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.778596 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.778605 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.778619 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.778628 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.881527 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.881607 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.881620 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.881635 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.881646 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.984630 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.985194 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.985369 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.985524 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:04 crc kubenswrapper[4789]: I1216 06:52:04.985663 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:04Z","lastTransitionTime":"2025-12-16T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.089786 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.089846 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.089863 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.089893 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.089950 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.104075 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:05 crc kubenswrapper[4789]: E1216 06:52:05.104230 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.193509 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.193555 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.193566 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.193585 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.193598 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.296179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.296214 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.296226 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.296242 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.296253 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.399448 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.399483 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.399494 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.399511 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.399521 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.503599 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.503661 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.503683 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.503713 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.503737 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.632092 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.632128 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.632135 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.632149 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.632158 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.734433 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.734472 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.734481 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.734493 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.734506 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.837501 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.837574 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.837592 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.837626 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.837646 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.941124 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.941179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.941199 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.941227 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:05 crc kubenswrapper[4789]: I1216 06:52:05.941248 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:05Z","lastTransitionTime":"2025-12-16T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.044338 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.044387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.044406 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.044427 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.044443 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.104251 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.104318 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.104251 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:06 crc kubenswrapper[4789]: E1216 06:52:06.104494 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:06 crc kubenswrapper[4789]: E1216 06:52:06.104609 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:06 crc kubenswrapper[4789]: E1216 06:52:06.104759 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.146844 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.146892 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.146944 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.146975 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.146996 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.249252 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.249284 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.249292 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.249305 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.249316 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.351595 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.351662 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.351687 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.351718 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.351741 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.454383 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.454413 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.454423 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.454438 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.454449 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.556737 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.556806 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.556820 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.556836 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.556847 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.658762 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.658798 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.658810 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.658827 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.658837 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.746247 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:06 crc kubenswrapper[4789]: E1216 06:52:06.746404 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:52:06 crc kubenswrapper[4789]: E1216 06:52:06.746669 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:52:38.746652557 +0000 UTC m=+97.008540186 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.761573 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.761960 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.761972 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.761989 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.762001 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.864937 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.864978 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.864989 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.865002 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.865011 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.967761 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.967842 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.967865 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.967894 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:06 crc kubenswrapper[4789]: I1216 06:52:06.967950 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:06Z","lastTransitionTime":"2025-12-16T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.070507 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.070547 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.070558 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.070574 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.070589 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.104857 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:07 crc kubenswrapper[4789]: E1216 06:52:07.105001 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.172613 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.172656 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.172667 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.172681 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.172692 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.274551 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.274587 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.274595 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.274607 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.274615 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.376516 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.376555 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.376565 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.376589 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.376598 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.478779 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.478816 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.478851 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.478868 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.478879 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.581741 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.581785 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.581793 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.581806 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.581816 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.685131 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.685183 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.685195 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.685214 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.685226 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.787146 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.787181 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.787192 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.787210 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.787221 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.889279 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.889322 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.889336 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.889351 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.889359 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.992161 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.992203 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.992211 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.992226 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:07 crc kubenswrapper[4789]: I1216 06:52:07.992235 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:07Z","lastTransitionTime":"2025-12-16T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.095013 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.095051 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.095059 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.095071 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.095081 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.104743 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.104773 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.104741 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:08 crc kubenswrapper[4789]: E1216 06:52:08.104866 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:08 crc kubenswrapper[4789]: E1216 06:52:08.105005 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:08 crc kubenswrapper[4789]: E1216 06:52:08.105169 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.197190 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.197234 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.197247 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.197263 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.197272 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.299378 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.299410 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.299418 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.299431 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.299440 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.401988 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.402031 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.402044 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.402060 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.402071 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.504991 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.505023 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.505030 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.505042 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.505051 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.512429 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/0.log" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.512481 4789 generic.go:334] "Generic (PLEG): container finished" podID="32431466-a255-4bf2-9237-4f48eab4a71e" containerID="771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8" exitCode=1 Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.512510 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerDied","Data":"771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.512902 4789 scope.go:117] "RemoveContainer" containerID="771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.525579 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.538016 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.552570 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64201
2f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.567532 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.579558 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.590655 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.602167 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.606883 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.606940 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.606952 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.606970 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.606981 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.612949 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.622506 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc 
kubenswrapper[4789]: I1216 06:52:08.631237 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.647175 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.659621 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.674004 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.688192 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\"
,\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.700502 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.710891 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.710947 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.710959 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.710980 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.710992 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.711766 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.729125 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cn
i-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.741253 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:08Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.813379 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.813418 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.813427 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.813441 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.813451 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.915705 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.915740 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.915749 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.915765 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:08 crc kubenswrapper[4789]: I1216 06:52:08.915774 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:08Z","lastTransitionTime":"2025-12-16T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.017441 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.017476 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.017486 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.017498 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.017507 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.105090 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:09 crc kubenswrapper[4789]: E1216 06:52:09.105233 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.119904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.119953 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.119965 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.119976 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.119987 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.222501 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.222545 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.222553 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.222568 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.222580 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.325478 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.325535 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.325552 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.325575 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.325592 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.428172 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.428214 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.428223 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.428240 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.428251 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.518680 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/0.log" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.518763 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerStarted","Data":"9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.530328 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.530371 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.530383 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.530402 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.530412 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.536244 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.564724 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66ac4d9134de028d2df65317b0d5b71e081da1c896d2ab87eea675312aa08a22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:33Z\\\",\\\"message\\\":\\\"s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.237560 6209 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.237637 6209 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:51:32.237842 6209 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238506 6209 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.238827 6209 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:51:32.239114 6209 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:51:32.239207 6209 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:51:32.241952 6209 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 06:51:32.242018 6209 factory.go:656] Stopping watch factory\\\\nI1216 06:51:32.242041 6209 ovnkube.go:599] Stopped ovnkube\\\\nI1216 06:51:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cn
i-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.576883 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.592852 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.611763 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.627482 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.632775 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.632842 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.632863 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.632888 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.632907 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.645895 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.661884 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.672288 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.684634 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.700339 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf
e6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.713330 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.726145 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.734807 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.734841 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.734853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.734869 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.734880 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.737645 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc 
kubenswrapper[4789]: I1216 06:52:09.750867 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.762775 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.786537 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.803044 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:09Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.836582 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.836626 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.836641 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 
06:52:09.836660 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.836673 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.938691 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.938728 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.938739 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.938756 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:09 crc kubenswrapper[4789]: I1216 06:52:09.938768 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:09Z","lastTransitionTime":"2025-12-16T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.040797 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.040831 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.040839 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.040852 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.040861 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.076297 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.076970 4789 scope.go:117] "RemoveContainer" containerID="87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071" Dec 16 06:52:10 crc kubenswrapper[4789]: E1216 06:52:10.077108 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.093337 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.104582 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:10 crc kubenswrapper[4789]: E1216 06:52:10.104670 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.104719 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:10 crc kubenswrapper[4789]: E1216 06:52:10.104899 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.105186 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:10 crc kubenswrapper[4789]: E1216 06:52:10.105287 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.110698 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.121425 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.132551 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.142474 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.142497 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.142505 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.142518 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.142527 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.154358 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for 
network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.165333 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.176831 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.188766 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.203438 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.221221 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.235635 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.245726 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.245783 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.245804 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.245827 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.245846 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.249261 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.268204 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.289337 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64201
2f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.305992 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.319170 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.329088 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.341260 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T06:52:10Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.347847 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.347908 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.347952 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.347976 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.347992 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.450498 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.450547 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.450560 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.450579 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.450593 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.552401 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.552460 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.552472 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.552490 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.552504 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.655220 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.655246 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.655254 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.655268 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.655276 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.757410 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.757443 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.757452 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.757466 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.757475 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.859148 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.859435 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.859498 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.859565 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.859628 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.962150 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.962186 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.962195 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.962212 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:10 crc kubenswrapper[4789]: I1216 06:52:10.962224 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:10Z","lastTransitionTime":"2025-12-16T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.064356 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.064400 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.064413 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.064431 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.064443 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.104361 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:11 crc kubenswrapper[4789]: E1216 06:52:11.104565 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.166674 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.166753 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.166773 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.166799 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.166817 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.269218 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.269252 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.269260 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.269272 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.269281 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.373104 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.373135 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.373147 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.373164 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.373177 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.476098 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.476154 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.476177 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.476205 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.476228 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.579622 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.579653 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.579660 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.579674 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.579690 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.682236 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.682271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.682280 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.682293 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.682304 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.784285 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.784328 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.784336 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.784349 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.784358 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.886380 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.886427 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.886445 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.886466 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.886483 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.989043 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.989117 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.989142 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.989213 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:11 crc kubenswrapper[4789]: I1216 06:52:11.989234 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:11Z","lastTransitionTime":"2025-12-16T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.091589 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.091648 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.091665 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.091687 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.091704 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.103984 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.104042 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.104061 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.104128 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.104200 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.104379 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.117965 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.118193 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.118259 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.118321 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.118403 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.125717 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.130539 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.134320 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.134462 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.134533 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.134597 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.134660 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.137336 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.146612 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.153592 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.153752 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.153935 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.154503 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.154547 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.154562 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.166809 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.171159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.171198 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.171216 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.171238 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.171255 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.173375 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for 
network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.184040 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.189950 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.193953 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.193985 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.193994 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.194008 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.194019 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.198718 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.224857 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.225648 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: E1216 06:52:12.225893 4789 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.227336 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.227415 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.227472 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.227531 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.227599 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.239516 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b
9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.260538 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.276339 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.289877 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.300491 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.311439 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.320551 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.329873 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.329897 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.329906 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.329929 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.329939 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.333743 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.354613 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.367173 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.378670 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:12Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.432513 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.432551 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.432561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.432576 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.432590 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.534897 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.535005 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.535029 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.535062 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.535086 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.637853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.637894 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.637902 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.637947 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.637960 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.740473 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.740529 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.740540 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.740558 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.740569 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.842699 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.842737 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.842746 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.842761 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.842771 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.945014 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.945360 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.945455 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.945539 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:12 crc kubenswrapper[4789]: I1216 06:52:12.945637 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:12Z","lastTransitionTime":"2025-12-16T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.048468 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.048512 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.048528 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.048544 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.048555 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.104897 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:13 crc kubenswrapper[4789]: E1216 06:52:13.105050 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.150794 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.150864 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.150881 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.150903 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.150957 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.253842 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.253904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.253958 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.253986 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.254006 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.357229 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.357523 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.357694 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.357837 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.358002 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.461131 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.461713 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.461819 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.461956 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.462041 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.564326 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.564372 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.564380 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.564393 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.564402 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.666562 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.666600 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.666608 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.666620 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.666629 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.769614 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.769676 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.769692 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.769714 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.769730 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.872284 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.872488 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.872573 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.872664 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.872737 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.975983 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.976295 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.976456 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.976622 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:13 crc kubenswrapper[4789]: I1216 06:52:13.976761 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:13Z","lastTransitionTime":"2025-12-16T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.079522 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.079561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.079573 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.079587 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.079599 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.104533 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:14 crc kubenswrapper[4789]: E1216 06:52:14.104681 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.104780 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:14 crc kubenswrapper[4789]: E1216 06:52:14.104907 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.104556 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:14 crc kubenswrapper[4789]: E1216 06:52:14.105214 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.182348 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.182394 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.182405 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.182420 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.182430 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.284399 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.284436 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.284446 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.284460 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.284470 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.387400 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.387478 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.387495 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.387515 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.387529 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.490064 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.490110 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.490121 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.490137 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.490149 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.592055 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.592092 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.592103 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.592117 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.592128 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.694517 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.694561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.694574 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.694590 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.694602 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.796769 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.796812 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.796827 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.796846 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.796860 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.899221 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.899274 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.899290 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.899312 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:14 crc kubenswrapper[4789]: I1216 06:52:14.899329 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:14Z","lastTransitionTime":"2025-12-16T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.002203 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.002243 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.002252 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.002265 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.002276 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.103836 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5"
Dec 16 06:52:15 crc kubenswrapper[4789]: E1216 06:52:15.104052 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.105060 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.105121 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.105144 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.105170 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.105188 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.207812 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.207887 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.208116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.208152 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.208174 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.310268 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.310313 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.310327 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.310344 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.310356 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.412446 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.412497 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.412510 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.412529 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.412543 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.515476 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.515572 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.515599 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.515630 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.515652 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.618464 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.618508 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.618520 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.618536 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.618548 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.721074 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.721117 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.721132 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.721153 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.721170 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.825715 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.825763 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.825773 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.825787 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.825800 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.928491 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.928523 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.928531 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.928543 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:15 crc kubenswrapper[4789]: I1216 06:52:15.928552 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:15Z","lastTransitionTime":"2025-12-16T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.031226 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.031299 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.031308 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.031322 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.031332 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.103942 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 06:52:16 crc kubenswrapper[4789]: E1216 06:52:16.104065 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.104257 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 06:52:16 crc kubenswrapper[4789]: E1216 06:52:16.104312 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.104522 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 06:52:16 crc kubenswrapper[4789]: E1216 06:52:16.104579 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.133306 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.133339 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.133351 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.133364 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.133375 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.235575 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.235616 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.235627 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.235642 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.235655 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.338574 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.339116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.339285 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.339487 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.339618 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.443499 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.443539 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.443549 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.443609 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.443622 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.546083 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.546141 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.546159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.546180 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.546197 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.648637 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.648678 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.648686 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.648699 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.648709 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.751284 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.751334 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.751346 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.751361 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.751372 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.853849 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.853940 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.853957 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.853981 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.853999 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.956555 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.956601 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.956617 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.956635 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:16 crc kubenswrapper[4789]: I1216 06:52:16.956650 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:16Z","lastTransitionTime":"2025-12-16T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.060045 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.060113 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.060137 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.060170 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.060193 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.105013 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5"
Dec 16 06:52:17 crc kubenswrapper[4789]: E1216 06:52:17.105313 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.162396 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.162429 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.162438 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.162451 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.162461 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.264980 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.265021 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.265029 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.265042 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.265054 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.367990 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.368069 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.368091 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.368115 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.368133 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.471378 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.471454 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.471477 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.471507 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.471529 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.574220 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.574284 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.574300 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.574326 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.574345 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.677388 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.677452 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.677470 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.677496 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.677515 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.779409 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.779444 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.779576 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.779601 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.779640 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.882214 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.882263 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.882274 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.882292 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.882303 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.985675 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.985743 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.985760 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.985783 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:17 crc kubenswrapper[4789]: I1216 06:52:17.985800 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:17Z","lastTransitionTime":"2025-12-16T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.088502 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.088564 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.088583 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.088609 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.088629 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.104139 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.104211 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:18 crc kubenswrapper[4789]: E1216 06:52:18.104250 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.104214 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:18 crc kubenswrapper[4789]: E1216 06:52:18.104372 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:18 crc kubenswrapper[4789]: E1216 06:52:18.104476 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.191131 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.191199 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.191211 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.191228 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.191242 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.294153 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.294252 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.294264 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.294279 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.294291 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.396731 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.396853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.396901 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.396968 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.396980 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.500436 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.500499 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.500517 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.500542 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.500560 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.633391 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.633426 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.633434 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.633450 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.633460 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.735488 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.735558 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.735584 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.735612 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.735631 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.838405 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.838462 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.838476 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.838499 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.838515 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.941717 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.941788 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.941806 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.941829 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:18 crc kubenswrapper[4789]: I1216 06:52:18.941848 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:18Z","lastTransitionTime":"2025-12-16T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.044746 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.044805 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.044823 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.044848 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.044865 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.104229 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:19 crc kubenswrapper[4789]: E1216 06:52:19.104374 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.147409 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.147452 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.147464 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.147485 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.147509 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.251564 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.251605 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.251615 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.251631 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.251642 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.355184 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.355245 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.355263 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.355287 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.355304 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.457096 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.457134 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.457145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.457160 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.457171 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.559701 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.559739 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.559747 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.559764 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.559776 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.662223 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.662311 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.662327 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.662349 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.662361 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.764843 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.764887 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.764900 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.764937 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.764978 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.868017 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.868093 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.868116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.868145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.868168 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.971500 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.971537 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.971545 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.971561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:19 crc kubenswrapper[4789]: I1216 06:52:19.971570 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:19Z","lastTransitionTime":"2025-12-16T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.074685 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.074771 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.074802 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.074833 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.074850 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.105246 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.105317 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.105319 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:20 crc kubenswrapper[4789]: E1216 06:52:20.105450 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:20 crc kubenswrapper[4789]: E1216 06:52:20.105577 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:20 crc kubenswrapper[4789]: E1216 06:52:20.105706 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.178031 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.178095 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.178157 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.178183 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.178195 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.282562 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.282637 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.282650 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.282668 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.282682 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.385971 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.386010 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.386026 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.386045 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.386059 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.488349 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.488418 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.488434 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.488471 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.488492 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.592282 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.592324 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.592337 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.592366 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.592379 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.695526 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.695561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.695572 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.695591 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.695906 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.797731 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.797770 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.797779 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.797795 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.797806 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.899880 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.899967 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.899981 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.900000 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:20 crc kubenswrapper[4789]: I1216 06:52:20.900012 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:20Z","lastTransitionTime":"2025-12-16T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.002800 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.002850 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.002859 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.002872 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.002880 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.103819 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:21 crc kubenswrapper[4789]: E1216 06:52:21.103957 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.105691 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.105723 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.105734 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.105747 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.105783 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.208023 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.208094 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.208116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.208145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.208168 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.310466 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.310538 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.310558 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.310586 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.310611 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.413901 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.413965 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.413975 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.413991 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.414003 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.516089 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.516157 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.516180 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.516213 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.516238 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.619145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.619210 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.619227 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.619249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.619267 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.721414 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.721454 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.721463 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.721476 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.721487 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.823110 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.823168 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.823181 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.823199 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.823212 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.925717 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.925756 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.925771 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.925787 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:21 crc kubenswrapper[4789]: I1216 06:52:21.925798 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:21Z","lastTransitionTime":"2025-12-16T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.029009 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.029340 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.029497 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.029646 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.029767 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.104497 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.104893 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.104643 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.105123 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.104522 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.105299 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.120423 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 
06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c
2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.131653 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.131696 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.131708 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.131723 4789 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.131735 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.132475 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T
06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.142349 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.161618 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for 
network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.172563 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.185268 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.201868 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64201
2f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.215948 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.227136 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.233518 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.233547 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.233578 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.233592 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.233601 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.238507 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.249883 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.263367 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.272630 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.272665 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.272675 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.272692 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.272704 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.275272 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.285217 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.285844 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc 
kubenswrapper[4789]: I1216 06:52:22.288339 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.288379 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.288387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.288401 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.288409 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.297125 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.300518 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.303698 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.303738 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.303765 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.303790 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.303804 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.315363 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.315387 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.318825 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.318853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.318862 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.318875 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.318885 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.327508 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.329199 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.332855 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.332893 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.332903 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.332923 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.332946 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.341384 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.343759 4789 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab55fff6-145f-4b59-9cc0-4a36ab767ab4\\\",\\\"systemUUID\\\":\\\"6f743a75-a9db-425a-b0df-337b667d61cb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:22Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:22 crc kubenswrapper[4789]: E1216 06:52:22.343895 4789 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.345234 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.345263 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.345271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.345284 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.345293 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.447146 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.447193 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.447202 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.447219 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.447229 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.549821 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.550107 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.550220 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.550298 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.550376 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.655641 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.656004 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.656099 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.656189 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.656254 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.759144 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.759202 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.759220 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.759242 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.759260 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.862077 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.862116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.862126 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.862142 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.862152 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.966284 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.966325 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.966337 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.966352 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:22 crc kubenswrapper[4789]: I1216 06:52:22.966364 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:22Z","lastTransitionTime":"2025-12-16T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.069096 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.069146 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.069159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.069178 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.069192 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.104296 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:23 crc kubenswrapper[4789]: E1216 06:52:23.104465 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.172381 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.172440 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.172459 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.172479 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.172495 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.275092 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.275179 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.275205 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.275240 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.275267 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.379082 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.379145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.379162 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.379186 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.379203 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.482409 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.482782 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.483040 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.483210 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.483353 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.586772 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.586836 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.586858 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.586885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.586905 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.690273 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.690349 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.690372 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.690402 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.690423 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.793504 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.793546 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.793560 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.793578 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.793590 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.896508 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.896542 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.896553 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.896573 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.896593 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.998621 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.998652 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.998684 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.998698 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:23 crc kubenswrapper[4789]: I1216 06:52:23.998706 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:23Z","lastTransitionTime":"2025-12-16T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.014333 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.014401 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.014462 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.014563 4789 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.014618 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.014592631 +0000 UTC m=+146.276480260 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.014742 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.014732953 +0000 UTC m=+146.276620582 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.014805 4789 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.014828 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.014822336 +0000 UTC m=+146.276709955 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.101369 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.101452 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.101501 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.101519 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.101529 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.104936 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.105035 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.105195 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.105266 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.105304 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.105626 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.106282 4789 scope.go:117] "RemoveContainer" containerID="87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.115069 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.115146 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.115362 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.115416 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.115439 4789 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.115517 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.115485992 +0000 UTC m=+146.377373651 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.116538 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.116564 4789 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.116578 4789 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:52:24 crc kubenswrapper[4789]: E1216 06:52:24.116631 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:28.116610936 +0000 UTC m=+146.378498795 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.204556 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.205211 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.205233 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.205257 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.205276 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.308603 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.308649 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.308661 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.308680 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.308691 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.412218 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.412280 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.412299 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.412323 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.412341 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.515597 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.515677 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.515696 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.515732 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.515758 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.619116 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.619175 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.619191 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.619214 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.619420 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.722813 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.722876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.722894 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.722955 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.722985 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.825550 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.825624 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.825634 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.825650 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.825662 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.929069 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.929147 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.929159 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.929177 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:24 crc kubenswrapper[4789]: I1216 06:52:24.929189 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:24Z","lastTransitionTime":"2025-12-16T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.031612 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.031678 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.031697 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.031723 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.031739 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.104480 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:25 crc kubenswrapper[4789]: E1216 06:52:25.104775 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.135617 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.135681 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.135704 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.135732 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.135755 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.238700 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.238728 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.238736 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.238751 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.238762 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.343037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.343110 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.343137 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.343172 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.343201 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.445851 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.445891 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.445904 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.445932 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.445943 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.548111 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.548145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.548154 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.548168 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.548178 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.570725 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/2.log" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.573392 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.573996 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.591476 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for 
openshift-marketplace/certified-operators for network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.603149 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.615042 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.624666 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799
f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.633768 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.645395 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.650310 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.650349 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.650360 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.650375 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.650384 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.657076 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.666812 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.678848 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.693517 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf
e6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.707662 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.719373 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.731188 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.739725 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc 
kubenswrapper[4789]: I1216 06:52:25.747890 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.762914 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.771572 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.771612 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.771623 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.771640 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.771652 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.774319 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.787316 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:25Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.873510 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.873544 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.873553 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc 
kubenswrapper[4789]: I1216 06:52:25.873573 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.873584 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.976273 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.976336 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.976349 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.976362 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:25 crc kubenswrapper[4789]: I1216 06:52:25.976371 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:25Z","lastTransitionTime":"2025-12-16T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.078275 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.078309 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.078320 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.078337 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.078347 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.104407 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.104406 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.104560 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:26 crc kubenswrapper[4789]: E1216 06:52:26.104673 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:26 crc kubenswrapper[4789]: E1216 06:52:26.104731 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:26 crc kubenswrapper[4789]: E1216 06:52:26.104809 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.180787 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.180818 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.180826 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.180839 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.180849 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.283320 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.283359 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.283374 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.283396 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.283411 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.386056 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.386096 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.386107 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.386124 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.386138 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.489196 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.489236 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.489244 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.489257 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.489269 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.578844 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/3.log" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.579783 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/2.log" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.583121 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" exitCode=1 Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.583228 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.583420 4789 scope.go:117] "RemoveContainer" containerID="87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.585062 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 06:52:26 crc kubenswrapper[4789]: E1216 06:52:26.585676 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.591312 4789 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.591585 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.591986 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.592360 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.593282 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.598180 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.634994 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.649014 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.662809 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.679041 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\"
,\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.688728 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.696307 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.696361 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.696375 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.696390 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.696402 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.698007 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.719972 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87f3a907a7ebb74d53ecc5f06953b82397c892466a7fb5f64e1f66ce3e6d8071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:51:56Z\\\",\\\"message\\\":\\\"eLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 06:51:55.815293 6485 lb_config.go:1031] Cluster endpoints for openshift-marketplace/certified-operators for 
network=default are: map[]\\\\nI1216 06:51:55.815310 6485 services_controller.go:443] Built service openshift-marketplace/certified-operators LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.214\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1216 06:51:55.815329 6485 services_controller.go:444] Built service openshift-marketplace/certified-operators LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1216 06:51:55.814870 6485 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:26Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI1216 06:52:25.851542 6917 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:52:25.851471 6917 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:52:25.851684 6917 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:52:25.851726 6917 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:52:25.851977 6917 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:52:25.852106 6917 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:52:25.852284 6917 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:52:25.852321 6917 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:52:25.852569 6917 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:52:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\
\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.729128 4789 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.741457 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.751842 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.764509 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.778726 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.791335 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.798844 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.798885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.798893 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.798932 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.798943 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.804594 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.816074 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.830283 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bfe6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64201
2f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.839375 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:26Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:26 crc 
kubenswrapper[4789]: I1216 06:52:26.901504 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.901544 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.901552 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.901564 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:26 crc kubenswrapper[4789]: I1216 06:52:26.901573 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:26Z","lastTransitionTime":"2025-12-16T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.003875 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.003931 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.003957 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.003972 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.003982 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.104635 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:27 crc kubenswrapper[4789]: E1216 06:52:27.104777 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.106093 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.106145 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.106158 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.106178 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.106191 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.209121 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.209164 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.209173 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.209188 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.209199 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.311502 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.311550 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.311562 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.311578 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.311592 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.414219 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.414302 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.414329 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.414361 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.414385 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.517058 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.517130 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.517157 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.517190 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.517217 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.590186 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/3.log" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.594234 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 06:52:27 crc kubenswrapper[4789]: E1216 06:52:27.594434 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.609475 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.620398 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.620429 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.620438 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.620450 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.620460 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.623687 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.640865 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.657044 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.671391 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca24a4b9-4b99-4de7-887d-f8804a4f06bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34dd29422d6a5a26db757f8da234ccf9490c700e463af325429496ec91113a94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b36
6a3c618459d54472e2be16b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ktmbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pdg87\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.685619 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-58dsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32431466-a255-4bf2-9237-4f48eab4a71e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:07Z\\\",\\\"message\\\":\\\"2025-12-16T06:51:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39\\\\n2025-12-16T06:51:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cf2fc148-6696-4ba5-8d06-b5b785bf4f39 to /host/opt/cni/bin/\\\\n2025-12-16T06:51:22Z [verbose] multus-daemon started\\\\n2025-12-16T06:51:22Z [verbose] 
Readiness Indicator file check\\\\n2025-12-16T06:52:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:52:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5shb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-58dsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.700392 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529ecdde-d194-4bf4-9e89-4accd6630349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bf
e6df598b1d1663c143d7ccaab378adf6f5190e0e96dff2817cf68803c6bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e9f5929139ba16a4a33d4bb8406956cc3592ff6ab5330ae3d750ad0bf79297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa11e0a9dfcb444bceed2de322a3f87bbd075f1db18d67fe6dab0bf6bdba8c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7059912aa32241b157851fc41c0c3dbe7512592f5aae77af1ac83c1aab5abc66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://642012f6bcfb4cea040dea8045b58b9a7abd7d7e50743b190eb1bf4505562d70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2504a2fab391f6d3c7b7aee969df6ad46e2fb303d9d15d2dd1bcdff8da72a1eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4051fb2e4c8ace071408978bfe2eb4e40770fe42a8478f5fc37db129e0485552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xcgv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b8tnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.713375 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.726766 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.726830 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.726855 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.726876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.726906 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.727568 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8c7br\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ttcm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc 
kubenswrapper[4789]: I1216 06:52:27.740653 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6cbf639-2df9-4d83-965e-148cb7787b12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e9754c3101318dd5eed3eb00eca7e720729bc66fba004eeade08d8e3555778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a811638fdab112ec63566b34338f533b08fbda1d2b065b3e72db2444e4cf51a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x8b7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.755554 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.769198 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.791525 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"614e4a27-04ae-4306-a30c-c0d2cbe6cfde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40682c9d498f305712220e488af4084b76ca1f6260e842f31506492de1a8978e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a41a8b688ad56a59957f5f0b0ae12754202784881e4c799bb5e4a75ef3a18270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8787345e72720ced01c7e651942d4c16a4275a1f9528f7a54c8f949cf406335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ec512391ccca913041ddf3b6a4deaf7e3d5f094d8b3301b9b8d37de1d59b86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7219c08e92224a6b2b7ff2c0e5fe5f03f7cd952c117a6d16fc6e2a9e515cfd71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7cd7d0a1d6b31641f8d69256a5407192a809bba48850aa593f6070270228f2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6911619fa93261efabe26ff68f973981b013097df4fd30fea6ece3586810fbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://715deab368a6c473cdf73bf941456a051ef25d60e9d7cb5e74a18c0bc0f6f47a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.804758 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7874f702165576d4b3afb7bce129308f5b4e0f2f44d0a778c42146b3bcc48b23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.814413 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ckj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"483d75f6-45a1-4182-b56b-9eff94bbed13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c142beab8232d1888af40c338d428c3d7fa28b9d7d465df99d2040494c0bbd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hdqkj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ckj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.829457 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.829502 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.829514 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.829533 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.829547 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.831259 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02a3f8b3-6393-4e58-9b49-506f85204b08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T06:52:26Z\\\",\\\"message\\\":\\\"alversions/factory.go:140\\\\nI1216 06:52:25.851542 6917 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:52:25.851471 6917 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 06:52:25.851684 6917 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:52:25.851726 6917 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 06:52:25.851977 6917 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:52:25.852106 6917 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:52:25.852284 6917 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 06:52:25.852321 6917 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 06:52:25.852569 6917 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:52:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1031b524c3298f7d52
8274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-blqv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pbvfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.840661 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wjmvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"902255f3-ae7f-4bce-bf64-b50fe8753a2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2b4ee3b425ebd3314fcbe7784144b17bf8900866906033b2022ac0dbd63a18b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lbfjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wjmvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.855882 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6fad8a2-3742-469d-be15-46a42233af5b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"HA384' detected.\\\\nW1216 06:51:20.076843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 06:51:20.076847 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 06:51:20.077037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1216 06:51:20.083889 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765867864\\\\\\\\\\\\\\\" (2025-12-16 06:51:03 +0000 UTC to 2026-01-15 06:51:04 +0000 UTC (now=2025-12-16 06:51:20.08386226 +0000 UTC))\\\\\\\"\\\\nF1216 06:51:20.085040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 06:51:20.117287 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1765867875\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1765867874\\\\\\\\\\\\\\\" (2025-12-16 05:51:14 +0000 UTC to 2026-12-16 05:51:14 +0000 UTC (now=2025-12-16 06:51:20.117241054 +0000 UTC))\\\\\\\"\\\\nI1216 06:51:20.117325 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1216 06:51:20.117360 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1216 06:51:20.117375 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1216 06:51:20.117868 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1216 06:51:20.117869 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4246151639/tls.crt::/tmp/serving-cert-4246151639/tls.key\\\\\\\"\\\\nI1216 06:51:20.117907 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:27Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.932816 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.932877 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.932886 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.932900 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:27 crc kubenswrapper[4789]: I1216 06:52:27.932909 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:27Z","lastTransitionTime":"2025-12-16T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.035697 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.035737 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.035746 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.035759 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.035770 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.104266 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.104353 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.104369 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:28 crc kubenswrapper[4789]: E1216 06:52:28.104551 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:28 crc kubenswrapper[4789]: E1216 06:52:28.104683 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:28 crc kubenswrapper[4789]: E1216 06:52:28.104817 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.139441 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.139508 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.139525 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.139742 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.139773 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.242155 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.242301 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.242333 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.242361 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.242378 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.345483 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.345522 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.345531 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.345544 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.345552 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.448883 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.448965 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.448977 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.448995 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.449008 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.551164 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.551224 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.551241 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.551263 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.551281 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.654084 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.654124 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.654134 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.654147 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.654157 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.758814 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.758923 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.758934 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.758951 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.758963 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.862982 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.863041 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.863058 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.863080 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.863094 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.966774 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.966850 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.966872 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.966898 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:28 crc kubenswrapper[4789]: I1216 06:52:28.966951 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:28Z","lastTransitionTime":"2025-12-16T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.070545 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.070612 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.070626 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.070648 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.070663 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.104447 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:29 crc kubenswrapper[4789]: E1216 06:52:29.104673 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.173561 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.173603 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.173611 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.173624 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.173634 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.275982 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.276062 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.276077 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.276105 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.276125 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.379677 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.379716 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.379725 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.379740 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.379749 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.482839 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.482899 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.482941 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.482966 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.482982 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.586272 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.586316 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.586325 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.586341 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.586351 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.689408 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.689475 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.689493 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.689521 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.689539 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.797362 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.797431 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.797451 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.797474 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.797491 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.900722 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.900759 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.900768 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.900783 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:29 crc kubenswrapper[4789]: I1216 06:52:29.900795 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:29Z","lastTransitionTime":"2025-12-16T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.003700 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.003770 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.003790 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.003818 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.003840 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.104395 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.104476 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.104395 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:30 crc kubenswrapper[4789]: E1216 06:52:30.104643 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:30 crc kubenswrapper[4789]: E1216 06:52:30.104759 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:30 crc kubenswrapper[4789]: E1216 06:52:30.104792 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.106212 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.106257 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.106267 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.106282 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.106292 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.122737 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.209266 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.209331 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.209347 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.209365 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.209378 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.312681 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.312728 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.312738 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.312753 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.312764 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.415823 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.415874 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.415885 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.415903 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.415937 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.519221 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.519271 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.519282 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.519301 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.519312 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.623044 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.623106 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.623120 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.623139 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.623154 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.727544 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.727613 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.727637 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.727668 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.727693 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.830487 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.830552 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.830569 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.830588 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.830602 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.933249 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.933331 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.933355 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.933391 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:30 crc kubenswrapper[4789]: I1216 06:52:30.933413 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:30Z","lastTransitionTime":"2025-12-16T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.036897 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.036977 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.036989 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.037005 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.037018 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.104722 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:31 crc kubenswrapper[4789]: E1216 06:52:31.105009 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.141995 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.142061 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.142079 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.142108 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.142127 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.246302 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.246387 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.246410 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.246443 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.246463 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.350742 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.350795 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.350809 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.350828 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.350842 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.454781 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.454862 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.454876 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.454898 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.454951 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.558754 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.558853 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.558951 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.558983 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.559003 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.662870 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.662949 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.662963 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.662988 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.663006 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.766822 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.766897 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.766945 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.766975 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.766996 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.870364 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.870416 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.870431 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.870453 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.870468 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.974617 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.974704 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.974729 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.974760 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:31 crc kubenswrapper[4789]: I1216 06:52:31.974781 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:31Z","lastTransitionTime":"2025-12-16T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.078015 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.078849 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.079210 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.080016 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.080245 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.105126 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.105367 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:32 crc kubenswrapper[4789]: E1216 06:52:32.105602 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.106232 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:32 crc kubenswrapper[4789]: E1216 06:52:32.106742 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:32 crc kubenswrapper[4789]: E1216 06:52:32.107164 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.134035 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b8d60bf-5e01-4c05-aaed-e860945db87f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c9be261070111b1201d0c82fa59431c364f12ca850caca10835123f82db00f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aee33ed8e29795318d121988c8708da3bf70af241fc23078228a70b59f89a337\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fa91e93320d79b2a02ceb0eb75951112c9ada53d0b79f597ddc29b98dbbdda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.153419 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80b3e888-736e-486f-9e31-1f75e38be511\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14fda4a61995f70846a97ef6ca2cbdea21c324e045d2f3b2e9e90ad2b646f805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3d77c8fb0605304dc0f86adbb343b9cf5fe5f8c2edc12e0a745f5c0f7245e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8c95e5432705496241308e41be339e1ee26de12c2138800ff5b0ec671f02cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e1db8eac0771831e460b2ad7bcddbdac0b1a4026c89285f9268352b3c594ef7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T06:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T06:51:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T06:51:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.169288 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b1fa6494900290f8f53c4746d85afbbe182065edab31fedde317021c08ac03a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.183100 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.183373 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.183524 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.183665 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.183773 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.188322 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bb4aa78d2e3bbcd37c77785b90405bcdb2556a943211272271967460a0cbf22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed6fe31f7e04329df68e0d2690462b374d730b3e9e816b5017efe20245175835\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T06:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.204633 4789 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T06:51:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T06:52:32Z is after 2025-08-24T17:21:41Z" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.241841 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podStartSLOduration=72.241812949 podStartE2EDuration="1m12.241812949s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.224125261 +0000 UTC m=+90.486012910" watchObservedRunningTime="2025-12-16 06:52:32.241812949 +0000 UTC m=+90.503700588" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.262048 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-58dsj" podStartSLOduration=72.26202811 podStartE2EDuration="1m12.26202811s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.242126206 +0000 UTC m=+90.504013845" watchObservedRunningTime="2025-12-16 06:52:32.26202811 +0000 UTC m=+90.523915739" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.272957 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b8tnx" podStartSLOduration=71.272938718 podStartE2EDuration="1m11.272938718s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.262250085 +0000 UTC m=+90.524137724" watchObservedRunningTime="2025-12-16 06:52:32.272938718 +0000 UTC m=+90.534826347" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.284960 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.284895786 podStartE2EDuration="2.284895786s" podCreationTimestamp="2025-12-16 06:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.283540168 +0000 UTC m=+90.545427797" watchObservedRunningTime="2025-12-16 06:52:32.284895786 +0000 UTC m=+90.546783455" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.286006 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.286054 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.286065 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.286084 4789 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.286098 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.299767 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x8b7m" podStartSLOduration=71.299752526 podStartE2EDuration="1m11.299752526s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.299615583 +0000 UTC m=+90.561503222" watchObservedRunningTime="2025-12-16 06:52:32.299752526 +0000 UTC m=+90.561640145" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.323426 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.323406359 podStartE2EDuration="1m11.323406359s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.322327316 +0000 UTC m=+90.584214955" watchObservedRunningTime="2025-12-16 06:52:32.323406359 +0000 UTC m=+90.585293998" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.378244 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.37822277 
podStartE2EDuration="1m12.37822277s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.365184379 +0000 UTC m=+90.627072048" watchObservedRunningTime="2025-12-16 06:52:32.37822277 +0000 UTC m=+90.640110409" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.389002 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.389061 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.389070 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.389115 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.389123 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.390254 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ckj56" podStartSLOduration=73.39023996 podStartE2EDuration="1m13.39023996s" podCreationTimestamp="2025-12-16 06:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.39024114 +0000 UTC m=+90.652128779" watchObservedRunningTime="2025-12-16 06:52:32.39023996 +0000 UTC m=+90.652127589" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.421662 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wjmvq" podStartSLOduration=72.421637255 podStartE2EDuration="1m12.421637255s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.42094533 +0000 UTC m=+90.682832959" watchObservedRunningTime="2025-12-16 06:52:32.421637255 +0000 UTC m=+90.683524904" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.491768 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.491822 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.491837 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.491857 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.491867 4789 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.594039 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.594082 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.594093 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.594110 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.594122 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.696696 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.696735 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.696744 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.696759 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.696769 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.720037 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.720083 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.720095 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.720113 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.720125 4789 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T06:52:32Z","lastTransitionTime":"2025-12-16T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.763478 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt"] Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.764215 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.766617 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.766649 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.766826 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.767476 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.779979 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=73.779954957 podStartE2EDuration="1m13.779954957s" podCreationTimestamp="2025-12-16 06:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.778945516 +0000 UTC m=+91.040833145" watchObservedRunningTime="2025-12-16 06:52:32.779954957 +0000 UTC m=+91.041842586" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.793629 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.793603431 podStartE2EDuration="38.793603431s" podCreationTimestamp="2025-12-16 06:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:32.79307784 +0000 UTC m=+91.054965469" watchObservedRunningTime="2025-12-16 06:52:32.793603431 
+0000 UTC m=+91.055491050" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.814142 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f0219dd4-370e-4131-89b1-9d8afc357794-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.814201 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0219dd4-370e-4131-89b1-9d8afc357794-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.814232 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0219dd4-370e-4131-89b1-9d8afc357794-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.814258 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0219dd4-370e-4131-89b1-9d8afc357794-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.814288 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f0219dd4-370e-4131-89b1-9d8afc357794-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.915018 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f0219dd4-370e-4131-89b1-9d8afc357794-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.915080 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f0219dd4-370e-4131-89b1-9d8afc357794-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.915159 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0219dd4-370e-4131-89b1-9d8afc357794-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.915214 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0219dd4-370e-4131-89b1-9d8afc357794-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.915267 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0219dd4-370e-4131-89b1-9d8afc357794-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.915546 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f0219dd4-370e-4131-89b1-9d8afc357794-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.915684 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f0219dd4-370e-4131-89b1-9d8afc357794-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.917628 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0219dd4-370e-4131-89b1-9d8afc357794-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.926699 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f0219dd4-370e-4131-89b1-9d8afc357794-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:32 crc kubenswrapper[4789]: I1216 06:52:32.932822 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0219dd4-370e-4131-89b1-9d8afc357794-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2m7dt\" (UID: \"f0219dd4-370e-4131-89b1-9d8afc357794\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:33 crc kubenswrapper[4789]: I1216 06:52:33.078111 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" Dec 16 06:52:33 crc kubenswrapper[4789]: W1216 06:52:33.095348 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0219dd4_370e_4131_89b1_9d8afc357794.slice/crio-850c64bcee15de093a39a29aec3069fad5415c4a7707692fc63a5fb3d4a66704 WatchSource:0}: Error finding container 850c64bcee15de093a39a29aec3069fad5415c4a7707692fc63a5fb3d4a66704: Status 404 returned error can't find the container with id 850c64bcee15de093a39a29aec3069fad5415c4a7707692fc63a5fb3d4a66704 Dec 16 06:52:33 crc kubenswrapper[4789]: I1216 06:52:33.104613 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:33 crc kubenswrapper[4789]: E1216 06:52:33.105040 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:33 crc kubenswrapper[4789]: I1216 06:52:33.617933 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" event={"ID":"f0219dd4-370e-4131-89b1-9d8afc357794","Type":"ContainerStarted","Data":"5c72cc258d9b92575f4c996a2d03cc5e8dddbda0f372e8ab996f908bd301d50f"} Dec 16 06:52:33 crc kubenswrapper[4789]: I1216 06:52:33.618274 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" event={"ID":"f0219dd4-370e-4131-89b1-9d8afc357794","Type":"ContainerStarted","Data":"850c64bcee15de093a39a29aec3069fad5415c4a7707692fc63a5fb3d4a66704"} Dec 16 06:52:33 crc kubenswrapper[4789]: I1216 06:52:33.645335 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2m7dt" podStartSLOduration=73.6453147 podStartE2EDuration="1m13.6453147s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:52:33.644328059 +0000 UTC m=+91.906215728" watchObservedRunningTime="2025-12-16 06:52:33.6453147 +0000 UTC m=+91.907202349" Dec 16 06:52:34 crc kubenswrapper[4789]: I1216 06:52:34.104315 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:34 crc kubenswrapper[4789]: I1216 06:52:34.104315 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:34 crc kubenswrapper[4789]: E1216 06:52:34.104432 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:34 crc kubenswrapper[4789]: I1216 06:52:34.104507 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:34 crc kubenswrapper[4789]: E1216 06:52:34.104600 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:34 crc kubenswrapper[4789]: E1216 06:52:34.104759 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:35 crc kubenswrapper[4789]: I1216 06:52:35.105226 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:35 crc kubenswrapper[4789]: E1216 06:52:35.105491 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:36 crc kubenswrapper[4789]: I1216 06:52:36.104928 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:36 crc kubenswrapper[4789]: E1216 06:52:36.105078 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:36 crc kubenswrapper[4789]: I1216 06:52:36.105132 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:36 crc kubenswrapper[4789]: I1216 06:52:36.105229 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:36 crc kubenswrapper[4789]: E1216 06:52:36.105278 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:36 crc kubenswrapper[4789]: E1216 06:52:36.105507 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:37 crc kubenswrapper[4789]: I1216 06:52:37.104025 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:37 crc kubenswrapper[4789]: E1216 06:52:37.104427 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:38 crc kubenswrapper[4789]: I1216 06:52:38.104767 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:38 crc kubenswrapper[4789]: I1216 06:52:38.104965 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:38 crc kubenswrapper[4789]: E1216 06:52:38.105066 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:38 crc kubenswrapper[4789]: I1216 06:52:38.105045 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:38 crc kubenswrapper[4789]: E1216 06:52:38.105171 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:38 crc kubenswrapper[4789]: E1216 06:52:38.105196 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:38 crc kubenswrapper[4789]: I1216 06:52:38.801582 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:38 crc kubenswrapper[4789]: E1216 06:52:38.801738 4789 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:52:38 crc kubenswrapper[4789]: E1216 06:52:38.801797 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs podName:21ceea53-d8d0-48a9-8c27-5cdd1028f0b7 nodeName:}" failed. No retries permitted until 2025-12-16 06:53:42.801781295 +0000 UTC m=+161.063668924 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs") pod "network-metrics-daemon-ttcm5" (UID: "21ceea53-d8d0-48a9-8c27-5cdd1028f0b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 06:52:39 crc kubenswrapper[4789]: I1216 06:52:39.104012 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:39 crc kubenswrapper[4789]: E1216 06:52:39.104672 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:39 crc kubenswrapper[4789]: I1216 06:52:39.105294 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 06:52:39 crc kubenswrapper[4789]: E1216 06:52:39.105564 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:52:40 crc kubenswrapper[4789]: I1216 06:52:40.104638 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:40 crc kubenswrapper[4789]: I1216 06:52:40.104688 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:40 crc kubenswrapper[4789]: E1216 06:52:40.104815 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:40 crc kubenswrapper[4789]: I1216 06:52:40.104880 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:40 crc kubenswrapper[4789]: E1216 06:52:40.105129 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:40 crc kubenswrapper[4789]: E1216 06:52:40.105238 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:41 crc kubenswrapper[4789]: I1216 06:52:41.105068 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:41 crc kubenswrapper[4789]: E1216 06:52:41.105650 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:42 crc kubenswrapper[4789]: I1216 06:52:42.104219 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:42 crc kubenswrapper[4789]: I1216 06:52:42.104322 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:42 crc kubenswrapper[4789]: I1216 06:52:42.104322 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:42 crc kubenswrapper[4789]: E1216 06:52:42.105339 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:42 crc kubenswrapper[4789]: E1216 06:52:42.105500 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:42 crc kubenswrapper[4789]: E1216 06:52:42.105604 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:43 crc kubenswrapper[4789]: I1216 06:52:43.104777 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:43 crc kubenswrapper[4789]: E1216 06:52:43.104937 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:44 crc kubenswrapper[4789]: I1216 06:52:44.104166 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:44 crc kubenswrapper[4789]: I1216 06:52:44.104223 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:44 crc kubenswrapper[4789]: I1216 06:52:44.104179 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:44 crc kubenswrapper[4789]: E1216 06:52:44.104407 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:44 crc kubenswrapper[4789]: E1216 06:52:44.104621 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:44 crc kubenswrapper[4789]: E1216 06:52:44.105270 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:45 crc kubenswrapper[4789]: I1216 06:52:45.104795 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:45 crc kubenswrapper[4789]: E1216 06:52:45.105301 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:46 crc kubenswrapper[4789]: I1216 06:52:46.104747 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:46 crc kubenswrapper[4789]: E1216 06:52:46.104986 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:46 crc kubenswrapper[4789]: I1216 06:52:46.105143 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:46 crc kubenswrapper[4789]: E1216 06:52:46.105375 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:46 crc kubenswrapper[4789]: I1216 06:52:46.106215 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:46 crc kubenswrapper[4789]: E1216 06:52:46.106520 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:47 crc kubenswrapper[4789]: I1216 06:52:47.104338 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:47 crc kubenswrapper[4789]: E1216 06:52:47.104524 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:48 crc kubenswrapper[4789]: I1216 06:52:48.104731 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:48 crc kubenswrapper[4789]: E1216 06:52:48.105353 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:48 crc kubenswrapper[4789]: I1216 06:52:48.104753 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:48 crc kubenswrapper[4789]: E1216 06:52:48.105545 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:48 crc kubenswrapper[4789]: I1216 06:52:48.105660 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:48 crc kubenswrapper[4789]: E1216 06:52:48.105794 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:49 crc kubenswrapper[4789]: I1216 06:52:49.104756 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:49 crc kubenswrapper[4789]: E1216 06:52:49.105063 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:50 crc kubenswrapper[4789]: I1216 06:52:50.104850 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:50 crc kubenswrapper[4789]: I1216 06:52:50.104969 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:50 crc kubenswrapper[4789]: I1216 06:52:50.105067 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:50 crc kubenswrapper[4789]: E1216 06:52:50.105162 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:50 crc kubenswrapper[4789]: E1216 06:52:50.105291 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:50 crc kubenswrapper[4789]: E1216 06:52:50.105843 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:50 crc kubenswrapper[4789]: I1216 06:52:50.106467 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 06:52:50 crc kubenswrapper[4789]: E1216 06:52:50.106830 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:52:51 crc kubenswrapper[4789]: I1216 06:52:51.104149 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:51 crc kubenswrapper[4789]: E1216 06:52:51.104304 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:52 crc kubenswrapper[4789]: I1216 06:52:52.105091 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:52 crc kubenswrapper[4789]: I1216 06:52:52.105162 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:52 crc kubenswrapper[4789]: I1216 06:52:52.105034 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:52 crc kubenswrapper[4789]: E1216 06:52:52.105868 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:52 crc kubenswrapper[4789]: E1216 06:52:52.106101 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:52 crc kubenswrapper[4789]: E1216 06:52:52.106073 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:53 crc kubenswrapper[4789]: I1216 06:52:53.104876 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:53 crc kubenswrapper[4789]: E1216 06:52:53.105084 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.104599 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.104699 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:54 crc kubenswrapper[4789]: E1216 06:52:54.104865 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:54 crc kubenswrapper[4789]: E1216 06:52:54.105028 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.104748 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:54 crc kubenswrapper[4789]: E1216 06:52:54.105270 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.685630 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/1.log" Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.686346 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/0.log" Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.686421 4789 generic.go:334] "Generic (PLEG): container finished" podID="32431466-a255-4bf2-9237-4f48eab4a71e" containerID="9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a" exitCode=1 Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.686469 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerDied","Data":"9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a"} Dec 16 06:52:54 crc kubenswrapper[4789]: I1216 06:52:54.686523 4789 scope.go:117] "RemoveContainer" containerID="771c995b9d10d871bb27edbd6d394937897f4a524b747e6275595a2a0ffda5d8" Dec 16 06:52:54 crc 
kubenswrapper[4789]: I1216 06:52:54.687189 4789 scope.go:117] "RemoveContainer" containerID="9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a" Dec 16 06:52:54 crc kubenswrapper[4789]: E1216 06:52:54.687537 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-58dsj_openshift-multus(32431466-a255-4bf2-9237-4f48eab4a71e)\"" pod="openshift-multus/multus-58dsj" podUID="32431466-a255-4bf2-9237-4f48eab4a71e" Dec 16 06:52:55 crc kubenswrapper[4789]: I1216 06:52:55.104827 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:55 crc kubenswrapper[4789]: E1216 06:52:55.105065 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:55 crc kubenswrapper[4789]: I1216 06:52:55.691274 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/1.log" Dec 16 06:52:56 crc kubenswrapper[4789]: I1216 06:52:56.105252 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:56 crc kubenswrapper[4789]: I1216 06:52:56.105255 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:56 crc kubenswrapper[4789]: E1216 06:52:56.105409 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:56 crc kubenswrapper[4789]: E1216 06:52:56.105528 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:56 crc kubenswrapper[4789]: I1216 06:52:56.105623 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:56 crc kubenswrapper[4789]: E1216 06:52:56.105706 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:57 crc kubenswrapper[4789]: I1216 06:52:57.104713 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:57 crc kubenswrapper[4789]: E1216 06:52:57.104970 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:52:58 crc kubenswrapper[4789]: I1216 06:52:58.104367 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:52:58 crc kubenswrapper[4789]: E1216 06:52:58.104529 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:52:58 crc kubenswrapper[4789]: I1216 06:52:58.104663 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:52:58 crc kubenswrapper[4789]: I1216 06:52:58.105059 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:52:58 crc kubenswrapper[4789]: E1216 06:52:58.106146 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:52:58 crc kubenswrapper[4789]: E1216 06:52:58.105885 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:52:59 crc kubenswrapper[4789]: I1216 06:52:59.104279 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:52:59 crc kubenswrapper[4789]: E1216 06:52:59.104388 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:00 crc kubenswrapper[4789]: I1216 06:53:00.104096 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:00 crc kubenswrapper[4789]: I1216 06:53:00.104163 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:00 crc kubenswrapper[4789]: I1216 06:53:00.104117 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:00 crc kubenswrapper[4789]: E1216 06:53:00.104277 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:00 crc kubenswrapper[4789]: E1216 06:53:00.104375 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:00 crc kubenswrapper[4789]: E1216 06:53:00.104446 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:01 crc kubenswrapper[4789]: I1216 06:53:01.104657 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:01 crc kubenswrapper[4789]: E1216 06:53:01.104815 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:02 crc kubenswrapper[4789]: E1216 06:53:02.049471 4789 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 16 06:53:02 crc kubenswrapper[4789]: I1216 06:53:02.103944 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:02 crc kubenswrapper[4789]: I1216 06:53:02.103980 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:02 crc kubenswrapper[4789]: E1216 06:53:02.104849 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:02 crc kubenswrapper[4789]: I1216 06:53:02.104867 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:02 crc kubenswrapper[4789]: E1216 06:53:02.105065 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:02 crc kubenswrapper[4789]: E1216 06:53:02.105111 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:02 crc kubenswrapper[4789]: E1216 06:53:02.179548 4789 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 06:53:03 crc kubenswrapper[4789]: I1216 06:53:03.104095 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:03 crc kubenswrapper[4789]: E1216 06:53:03.104252 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:04 crc kubenswrapper[4789]: I1216 06:53:04.104518 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:04 crc kubenswrapper[4789]: I1216 06:53:04.104595 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:04 crc kubenswrapper[4789]: E1216 06:53:04.104724 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:04 crc kubenswrapper[4789]: I1216 06:53:04.104846 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:04 crc kubenswrapper[4789]: E1216 06:53:04.104896 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:04 crc kubenswrapper[4789]: E1216 06:53:04.105171 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:05 crc kubenswrapper[4789]: I1216 06:53:05.105008 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:05 crc kubenswrapper[4789]: E1216 06:53:05.105207 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:05 crc kubenswrapper[4789]: I1216 06:53:05.105944 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 06:53:05 crc kubenswrapper[4789]: E1216 06:53:05.106157 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pbvfm_openshift-ovn-kubernetes(02a3f8b3-6393-4e58-9b49-506f85204b08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" Dec 16 06:53:06 crc kubenswrapper[4789]: I1216 06:53:06.104480 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:06 crc kubenswrapper[4789]: I1216 06:53:06.104480 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:06 crc kubenswrapper[4789]: I1216 06:53:06.104715 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:06 crc kubenswrapper[4789]: E1216 06:53:06.104633 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:06 crc kubenswrapper[4789]: E1216 06:53:06.104746 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:06 crc kubenswrapper[4789]: E1216 06:53:06.104846 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:07 crc kubenswrapper[4789]: I1216 06:53:07.104315 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:07 crc kubenswrapper[4789]: E1216 06:53:07.104437 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:07 crc kubenswrapper[4789]: E1216 06:53:07.180715 4789 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 06:53:08 crc kubenswrapper[4789]: I1216 06:53:08.104453 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:08 crc kubenswrapper[4789]: I1216 06:53:08.104541 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:08 crc kubenswrapper[4789]: I1216 06:53:08.104549 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:08 crc kubenswrapper[4789]: E1216 06:53:08.104619 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:08 crc kubenswrapper[4789]: E1216 06:53:08.104695 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:08 crc kubenswrapper[4789]: I1216 06:53:08.105098 4789 scope.go:117] "RemoveContainer" containerID="9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a" Dec 16 06:53:08 crc kubenswrapper[4789]: E1216 06:53:08.105123 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:08 crc kubenswrapper[4789]: I1216 06:53:08.742647 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/1.log" Dec 16 06:53:08 crc kubenswrapper[4789]: I1216 06:53:08.743182 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerStarted","Data":"bd434f3a0278709c2668ba4811723fb471cc6af28d94e7295ba888033dbe733f"} Dec 16 06:53:09 crc kubenswrapper[4789]: I1216 06:53:09.104473 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:09 crc kubenswrapper[4789]: E1216 06:53:09.104616 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:10 crc kubenswrapper[4789]: I1216 06:53:10.104950 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:10 crc kubenswrapper[4789]: I1216 06:53:10.104991 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:10 crc kubenswrapper[4789]: I1216 06:53:10.105039 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:10 crc kubenswrapper[4789]: E1216 06:53:10.105124 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:10 crc kubenswrapper[4789]: E1216 06:53:10.105239 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:10 crc kubenswrapper[4789]: E1216 06:53:10.105418 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:11 crc kubenswrapper[4789]: I1216 06:53:11.104006 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:11 crc kubenswrapper[4789]: E1216 06:53:11.104146 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:12 crc kubenswrapper[4789]: I1216 06:53:12.104196 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:12 crc kubenswrapper[4789]: I1216 06:53:12.104243 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:12 crc kubenswrapper[4789]: I1216 06:53:12.104145 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:12 crc kubenswrapper[4789]: E1216 06:53:12.105058 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:12 crc kubenswrapper[4789]: E1216 06:53:12.105219 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:12 crc kubenswrapper[4789]: E1216 06:53:12.105326 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:12 crc kubenswrapper[4789]: E1216 06:53:12.181141 4789 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 06:53:13 crc kubenswrapper[4789]: I1216 06:53:13.104349 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:13 crc kubenswrapper[4789]: E1216 06:53:13.104458 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:14 crc kubenswrapper[4789]: I1216 06:53:14.104854 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:14 crc kubenswrapper[4789]: I1216 06:53:14.105068 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:14 crc kubenswrapper[4789]: E1216 06:53:14.105207 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:14 crc kubenswrapper[4789]: I1216 06:53:14.105263 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:14 crc kubenswrapper[4789]: E1216 06:53:14.105453 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:14 crc kubenswrapper[4789]: E1216 06:53:14.105548 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:15 crc kubenswrapper[4789]: I1216 06:53:15.104049 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:15 crc kubenswrapper[4789]: E1216 06:53:15.104546 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:16 crc kubenswrapper[4789]: I1216 06:53:16.104117 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:16 crc kubenswrapper[4789]: I1216 06:53:16.104155 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:16 crc kubenswrapper[4789]: E1216 06:53:16.104251 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:16 crc kubenswrapper[4789]: I1216 06:53:16.104281 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:16 crc kubenswrapper[4789]: E1216 06:53:16.104427 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:16 crc kubenswrapper[4789]: E1216 06:53:16.104532 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:17 crc kubenswrapper[4789]: I1216 06:53:17.104876 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:17 crc kubenswrapper[4789]: E1216 06:53:17.105047 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:17 crc kubenswrapper[4789]: I1216 06:53:17.105711 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 06:53:17 crc kubenswrapper[4789]: E1216 06:53:17.182253 4789 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 16 06:53:17 crc kubenswrapper[4789]: I1216 06:53:17.777605 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/3.log" Dec 16 06:53:17 crc kubenswrapper[4789]: I1216 06:53:17.780026 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerStarted","Data":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} Dec 16 06:53:17 crc kubenswrapper[4789]: I1216 06:53:17.780403 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:53:17 crc kubenswrapper[4789]: I1216 06:53:17.804558 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podStartSLOduration=116.804511288 podStartE2EDuration="1m56.804511288s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:17.803502995 +0000 UTC m=+136.065390624" watchObservedRunningTime="2025-12-16 06:53:17.804511288 +0000 UTC m=+136.066398917" Dec 16 06:53:18 crc kubenswrapper[4789]: I1216 06:53:18.028218 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ttcm5"] Dec 16 06:53:18 crc kubenswrapper[4789]: I1216 06:53:18.028373 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:18 crc kubenswrapper[4789]: E1216 06:53:18.028472 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:18 crc kubenswrapper[4789]: I1216 06:53:18.104861 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:18 crc kubenswrapper[4789]: I1216 06:53:18.104968 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:18 crc kubenswrapper[4789]: E1216 06:53:18.105032 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:18 crc kubenswrapper[4789]: I1216 06:53:18.104861 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:18 crc kubenswrapper[4789]: E1216 06:53:18.105194 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:18 crc kubenswrapper[4789]: E1216 06:53:18.105282 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:20 crc kubenswrapper[4789]: I1216 06:53:20.104559 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:20 crc kubenswrapper[4789]: E1216 06:53:20.105024 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:20 crc kubenswrapper[4789]: I1216 06:53:20.104656 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:20 crc kubenswrapper[4789]: I1216 06:53:20.104626 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:20 crc kubenswrapper[4789]: E1216 06:53:20.105113 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:20 crc kubenswrapper[4789]: I1216 06:53:20.104723 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:20 crc kubenswrapper[4789]: E1216 06:53:20.105186 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:20 crc kubenswrapper[4789]: E1216 06:53:20.105252 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:22 crc kubenswrapper[4789]: I1216 06:53:22.104928 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:22 crc kubenswrapper[4789]: I1216 06:53:22.104960 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:22 crc kubenswrapper[4789]: E1216 06:53:22.105797 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 06:53:22 crc kubenswrapper[4789]: I1216 06:53:22.106026 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:22 crc kubenswrapper[4789]: I1216 06:53:22.106112 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:22 crc kubenswrapper[4789]: E1216 06:53:22.106195 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 06:53:22 crc kubenswrapper[4789]: E1216 06:53:22.106631 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 06:53:22 crc kubenswrapper[4789]: E1216 06:53:22.106844 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ttcm5" podUID="21ceea53-d8d0-48a9-8c27-5cdd1028f0b7" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.217950 4789 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.256964 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-slw4b"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.262457 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sg5st"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.262743 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.263052 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.263617 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.264727 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.265373 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.265749 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.266339 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.266386 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.266963 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.266502 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.266781 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.266806 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.267899 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.271136 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Dec 16 06:53:23 crc kubenswrapper[4789]: W1216 06:53:23.271248 4789 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.271259 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: E1216 06:53:23.271287 4789 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.271334 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.271159 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.271174 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.271508 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: W1216 06:53:23.271213 4789 reflector.go:561] 
object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Dec 16 06:53:23 crc kubenswrapper[4789]: E1216 06:53:23.271567 4789 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.271610 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 06:53:23 crc kubenswrapper[4789]: W1216 06:53:23.271727 4789 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-nl2j4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Dec 16 06:53:23 crc kubenswrapper[4789]: E1216 06:53:23.271813 4789 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-nl2j4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-nl2j4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 06:53:23 crc 
kubenswrapper[4789]: W1216 06:53:23.272062 4789 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.272087 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: W1216 06:53:23.272103 4789 reflector.go:561] object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Dec 16 06:53:23 crc kubenswrapper[4789]: E1216 06:53:23.272124 4789 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.272143 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: E1216 06:53:23.272088 4789 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User 
\"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 06:53:23 crc kubenswrapper[4789]: W1216 06:53:23.272263 4789 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-tls": failed to list *v1.Secret: secrets "machine-approver-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Dec 16 06:53:23 crc kubenswrapper[4789]: E1216 06:53:23.272277 4789 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.272405 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.274310 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.274493 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.274538 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.274577 4789 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.277393 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.278046 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.285606 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.288549 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.288807 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.299692 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.320076 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.321001 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.321612 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.322359 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.323807 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.324126 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5786"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.324403 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.324722 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.324944 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.325292 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.325550 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.326744 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s4s66"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.327138 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.327540 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.327812 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329267 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-node-pullsecrets\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329318 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-audit-dir\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329351 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3419dec-7204-424c-8795-eb11d1b22316-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329386 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329408 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzz2h\" (UniqueName: \"kubernetes.io/projected/e3419dec-7204-424c-8795-eb11d1b22316-kube-api-access-xzz2h\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329428 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-etcd-serving-ca\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329447 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-config\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc 
kubenswrapper[4789]: I1216 06:53:23.329475 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3419dec-7204-424c-8795-eb11d1b22316-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329493 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-audit\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329512 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-config\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329531 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-service-ca-bundle\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329554 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-image-import-ca\") pod 
\"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329574 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329595 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5mmc\" (UniqueName: \"kubernetes.io/projected/2740e575-b6e0-470b-acd2-bf03614e7d35-kube-api-access-s5mmc\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329612 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-serving-cert\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329630 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7t9z\" (UniqueName: \"kubernetes.io/projected/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-kube-api-access-j7t9z\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329648 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2740e575-b6e0-470b-acd2-bf03614e7d35-serving-cert\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329665 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-encryption-config\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.329685 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-etcd-client\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.330728 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.337085 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.337799 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.337814 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.338438 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.338653 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.338876 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.338991 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.339219 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.339592 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.339631 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.339866 4789 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.340002 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.340254 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.340298 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.340295 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.340975 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.341275 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.341668 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.341854 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.341908 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.342098 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.342241 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.342309 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.342543 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.344042 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lvrjn"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.344573 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.347556 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.347688 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.347792 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.347871 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.347942 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 
06:53:23.348021 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.348140 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.348200 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.348234 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.348342 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.348392 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.347636 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.350178 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-66bwg"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.350221 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.350483 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.350578 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.350670 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-66bwg" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.353980 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fnph9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.354643 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.357379 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.357686 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.357720 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.357701 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.359385 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.359508 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.359626 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.359733 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.359817 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.360055 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.360149 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.360207 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.360330 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.384734 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pwj9t"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.385497 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.386719 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.391301 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.388139 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.388335 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.388515 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.388572 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.360336 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.389058 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.396850 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s57cr"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.397768 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.402062 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.402234 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.404245 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405079 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.404513 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405468 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.404578 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405226 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405262 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405262 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405276 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405298 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.409945 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.410549 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6ctc"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.410657 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.410023 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.410563 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405305 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405315 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405353 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405377 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405388 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.405401 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.407438 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.408438 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.412493 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.413051 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.414228 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mw7cs"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.414386 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.412618 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.414414 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.415755 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.416826 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-slw4b"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.417792 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.418251 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.425810 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431008 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431208 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzz2h\" (UniqueName: \"kubernetes.io/projected/e3419dec-7204-424c-8795-eb11d1b22316-kube-api-access-xzz2h\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431302 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-etcd-serving-ca\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431372 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-config\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431451 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3419dec-7204-424c-8795-eb11d1b22316-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431520 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-audit\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431591 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-config\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431658 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-service-ca-bundle\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431727 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-image-import-ca\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431801 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431881 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5mmc\" (UniqueName: \"kubernetes.io/projected/2740e575-b6e0-470b-acd2-bf03614e7d35-kube-api-access-s5mmc\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.431993 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-serving-cert\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432065 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7t9z\" (UniqueName: \"kubernetes.io/projected/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-kube-api-access-j7t9z\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432131 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2740e575-b6e0-470b-acd2-bf03614e7d35-serving-cert\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432202 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-encryption-config\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432275 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-etcd-client\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432340 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-node-pullsecrets\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: 
I1216 06:53:23.432418 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa338783-6d00-4150-96d3-03ef1f28eb41-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hx7q5\" (UID: \"fa338783-6d00-4150-96d3-03ef1f28eb41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432501 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-audit-dir\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432573 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3419dec-7204-424c-8795-eb11d1b22316-config\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.432651 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9m2\" (UniqueName: \"kubernetes.io/projected/fa338783-6d00-4150-96d3-03ef1f28eb41-kube-api-access-rn9m2\") pod \"control-plane-machine-set-operator-78cbb6b69f-hx7q5\" (UID: \"fa338783-6d00-4150-96d3-03ef1f28eb41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.433714 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.434368 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-etcd-serving-ca\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.435634 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.436299 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-cbqb2"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.436640 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.437299 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.437844 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.438067 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.438450 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-config\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.439715 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-node-pullsecrets\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.440044 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-audit-dir\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.440618 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-config\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.441013 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-image-import-ca\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " 
pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.441214 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2740e575-b6e0-470b-acd2-bf03614e7d35-service-ca-bundle\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.441573 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-audit\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.441572 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.441860 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.442502 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmw44"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.442969 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.443654 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.445327 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.446232 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.446333 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7ntql"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.447412 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-encryption-config\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.447997 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3419dec-7204-424c-8795-eb11d1b22316-config\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.448086 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.452135 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-etcd-client\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.452455 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3419dec-7204-424c-8795-eb11d1b22316-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: \"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.454195 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-serving-cert\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.454406 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2740e575-b6e0-470b-acd2-bf03614e7d35-serving-cert\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.458480 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hspwh"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.458980 4789 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.459217 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.467265 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.471323 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.471983 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.472151 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-djgqn"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.472809 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.475982 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sg5st"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.476102 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.476171 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.485061 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.486163 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.488767 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.490466 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lvrjn"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.491600 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7fvzt"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.492324 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.492846 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.494099 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.495518 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.496645 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s57cr"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.497737 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.498932 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.500060 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s4s66"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.501251 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-66bwg"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.502890 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.503891 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.504353 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.506677 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.508204 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cbqb2"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.509350 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.510463 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.511782 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5786"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.513122 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6ctc"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.514400 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.515842 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.517868 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-mw7cs"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.519212 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.521160 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-djgqn"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.524126 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-98ssf"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.525286 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.526046 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7stjw"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.527535 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hspwh"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.527680 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.531759 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fnph9"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.534113 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9m2\" (UniqueName: \"kubernetes.io/projected/fa338783-6d00-4150-96d3-03ef1f28eb41-kube-api-access-rn9m2\") pod \"control-plane-machine-set-operator-78cbb6b69f-hx7q5\" (UID: \"fa338783-6d00-4150-96d3-03ef1f28eb41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.534285 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa338783-6d00-4150-96d3-03ef1f28eb41-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hx7q5\" (UID: \"fa338783-6d00-4150-96d3-03ef1f28eb41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.534283 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.535722 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmw44"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.537095 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.539337 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 
06:53:23.540812 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa338783-6d00-4150-96d3-03ef1f28eb41-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hx7q5\" (UID: \"fa338783-6d00-4150-96d3-03ef1f28eb41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.543663 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7ntql"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.545362 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-98ssf"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.545400 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.546451 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7stjw"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.547512 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ch664"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.548540 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.548576 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ch664"] Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.565357 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.585571 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.605315 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.626072 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.645623 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.666220 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.685475 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.705408 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.725326 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" 
Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.745103 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.766603 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.786258 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.806020 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.825354 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.845634 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.866077 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.884799 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.905864 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.925743 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 
06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.944597 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.966203 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 06:53:23 crc kubenswrapper[4789]: I1216 06:53:23.986902 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.006741 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.026072 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.046811 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.076743 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.086650 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.104091 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.104131 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.104129 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.104125 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.107764 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.125656 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.145425 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.165569 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.186524 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.207383 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.224775 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.262740 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzz2h\" (UniqueName: \"kubernetes.io/projected/e3419dec-7204-424c-8795-eb11d1b22316-kube-api-access-xzz2h\") pod \"openshift-apiserver-operator-796bbdcf4f-68bzj\" (UID: 
\"e3419dec-7204-424c-8795-eb11d1b22316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.265949 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.292631 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.306419 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.342452 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7t9z\" (UniqueName: \"kubernetes.io/projected/c06d6dec-3c45-42f3-bd57-dece3f5dafe6-kube-api-access-j7t9z\") pod \"apiserver-76f77b778f-slw4b\" (UID: \"c06d6dec-3c45-42f3-bd57-dece3f5dafe6\") " pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.347510 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.365905 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.385382 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.405141 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.426444 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 
06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.443485 4789 request.go:700] Waited for 1.003345758s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/secrets?fieldSelector=metadata.name%3Dmetrics-tls&limit=500&resourceVersion=0 Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.445428 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.480674 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5mmc\" (UniqueName: \"kubernetes.io/projected/2740e575-b6e0-470b-acd2-bf03614e7d35-kube-api-access-s5mmc\") pod \"authentication-operator-69f744f599-sg5st\" (UID: \"2740e575-b6e0-470b-acd2-bf03614e7d35\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.485777 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.493700 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.505581 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.525172 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.525527 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.535428 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.556300 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.580402 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.587357 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.605415 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.627372 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.647134 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.666029 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.685063 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.705800 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.725301 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.729309 4789 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-slw4b"] Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.744958 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.747169 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj"] Dec 16 06:53:24 crc kubenswrapper[4789]: W1216 06:53:24.754006 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3419dec_7204_424c_8795_eb11d1b22316.slice/crio-8a3503308f6e1d7473c4dc7f01c05f673fa0008a7b24d61c09cdb87d2530f385 WatchSource:0}: Error finding container 8a3503308f6e1d7473c4dc7f01c05f673fa0008a7b24d61c09cdb87d2530f385: Status 404 returned error can't find the container with id 8a3503308f6e1d7473c4dc7f01c05f673fa0008a7b24d61c09cdb87d2530f385 Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.763426 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sg5st"] Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.765048 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 06:53:24 crc kubenswrapper[4789]: W1216 06:53:24.769342 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2740e575_b6e0_470b_acd2_bf03614e7d35.slice/crio-16cad80388ea1b3a5ca1e1ea682c42fdf81c7c525e3eb566a2634bcad7a81f10 WatchSource:0}: Error finding container 16cad80388ea1b3a5ca1e1ea682c42fdf81c7c525e3eb566a2634bcad7a81f10: Status 404 returned error can't find the container with id 16cad80388ea1b3a5ca1e1ea682c42fdf81c7c525e3eb566a2634bcad7a81f10 Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.784459 4789 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.801954 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" event={"ID":"2740e575-b6e0-470b-acd2-bf03614e7d35","Type":"ContainerStarted","Data":"16cad80388ea1b3a5ca1e1ea682c42fdf81c7c525e3eb566a2634bcad7a81f10"} Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.803071 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" event={"ID":"c06d6dec-3c45-42f3-bd57-dece3f5dafe6","Type":"ContainerStarted","Data":"72593a99b5473dcafdd3725a82b77214e583fd8a646c0d2a788413315fced670"} Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.804483 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" event={"ID":"e3419dec-7204-424c-8795-eb11d1b22316","Type":"ContainerStarted","Data":"8a3503308f6e1d7473c4dc7f01c05f673fa0008a7b24d61c09cdb87d2530f385"} Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.812254 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.824970 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.845838 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.865126 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.885209 4789 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.905956 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.925699 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.945385 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.965752 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 06:53:24 crc kubenswrapper[4789]: I1216 06:53:24.985356 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.004739 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.025678 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.044603 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.065314 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.085242 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.106297 4789 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.146086 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149513 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149547 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149571 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49e8732a-7f75-4b45-94d7-ad27168422b4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149590 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fq72\" (UniqueName: \"kubernetes.io/projected/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-kube-api-access-9fq72\") pod 
\"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149610 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-service-ca\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149645 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-bound-sa-token\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149709 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjw9\" (UniqueName: \"kubernetes.io/projected/5221dd3a-57e8-43ff-ac08-62cbfc025419-kube-api-access-fbjw9\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149728 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149768 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88r4l\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-kube-api-access-88r4l\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149903 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e8732a-7f75-4b45-94d7-ad27168422b4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149940 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-config\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.149966 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150084 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-policies\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.150178 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:25.650157445 +0000 UTC m=+143.912045174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150248 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150282 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1038662-bd55-4e28-bd30-53e66f03ff85-machine-approver-tls\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150310 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-default-certificate\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150338 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jvs9x\" (UID: \"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150361 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150385 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-config\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150403 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150420 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e8732a-7f75-4b45-94d7-ad27168422b4-config\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150454 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150506 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9zmd\" (UniqueName: \"kubernetes.io/projected/51cf8011-671e-401b-ba7f-b062fa607e7f-kube-api-access-m9zmd\") pod \"etcd-operator-b45778765-lvrjn\" (UID: 
\"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150539 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150579 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150683 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-registry-certificates\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150796 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-client-ca\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc 
kubenswrapper[4789]: I1216 06:53:25.150865 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150885 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbz56\" (UniqueName: \"kubernetes.io/projected/90244cab-89b7-4109-b673-a7cd881ae0a4-kube-api-access-sbz56\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150903 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be028739-1351-4883-95ec-35fb89831c72-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150940 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktjc\" (UniqueName: \"kubernetes.io/projected/d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f-kube-api-access-gktjc\") pod \"cluster-samples-operator-665b6dd947-jvs9x\" (UID: \"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150958 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tqzt9\" (UniqueName: \"kubernetes.io/projected/f1038662-bd55-4e28-bd30-53e66f03ff85-kube-api-access-tqzt9\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150975 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-config\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.150992 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkdcv\" (UniqueName: \"kubernetes.io/projected/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-kube-api-access-tkdcv\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151028 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx4wm\" (UniqueName: \"kubernetes.io/projected/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-kube-api-access-rx4wm\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151044 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqlns\" (UniqueName: 
\"kubernetes.io/projected/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-kube-api-access-tqlns\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151064 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-registry-tls\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151081 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151101 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/90244cab-89b7-4109-b673-a7cd881ae0a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151166 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-client\") pod \"etcd-operator-b45778765-lvrjn\" (UID: 
\"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151208 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-metrics-certs\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151234 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-serving-cert\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151256 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1038662-bd55-4e28-bd30-53e66f03ff85-auth-proxy-config\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151300 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151351 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-trusted-ca\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151385 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be028739-1351-4883-95ec-35fb89831c72-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151445 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5221dd3a-57e8-43ff-ac08-62cbfc025419-service-ca-bundle\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151471 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151495 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-proxy-tls\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: 
\"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151518 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151540 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151565 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqrs\" (UniqueName: \"kubernetes.io/projected/fa494689-eaaa-455c-ba63-2f6a295d5a27-kube-api-access-prqrs\") pod \"downloads-7954f5f757-66bwg\" (UID: \"fa494689-eaaa-455c-ba63-2f6a295d5a27\") " pod="openshift-console/downloads-7954f5f757-66bwg" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151589 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc 
kubenswrapper[4789]: I1216 06:53:25.151629 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-stats-auth\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151665 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151688 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cf8011-671e-401b-ba7f-b062fa607e7f-serving-cert\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151709 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-config\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151735 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-images\") pod \"machine-api-operator-5694c8668f-fnph9\" 
(UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151790 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90244cab-89b7-4109-b673-a7cd881ae0a4-serving-cert\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151813 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1038662-bd55-4e28-bd30-53e66f03ff85-config\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151850 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfml8\" (UniqueName: \"kubernetes.io/projected/32383d71-3226-46ea-9d69-c3ab1096ec2c-kube-api-access-tfml8\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151892 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151943 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr79q\" (UniqueName: \"kubernetes.io/projected/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-kube-api-access-hr79q\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.151973 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-ca\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.152015 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-dir\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.165702 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.186012 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.225444 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.246166 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 
06:53:25.253253 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.253403 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:25.753379958 +0000 UTC m=+144.015267587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.253554 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-config\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.254407 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-config\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 
06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.254461 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.254491 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed831a75-86f6-4f98-a6f2-63e45e6f051b-apiservice-cert\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.254520 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255312 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-serving-cert\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255342 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9zmd\" (UniqueName: 
\"kubernetes.io/projected/51cf8011-671e-401b-ba7f-b062fa607e7f-kube-api-access-m9zmd\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255362 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255379 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-config\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255399 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255422 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255453 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255474 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc6f123f-a6d4-4451-bdb5-82286e190c55-audit-dir\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255494 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-registry-certificates\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255511 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-config\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255525 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-client-ca\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255540 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/120a166b-9fed-4921-940d-6c43c0a145c0-srv-cert\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255568 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f22acd3-56c4-42b8-badc-1239c0050781-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255615 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6e8c5a-b937-4072-902d-28e056de16d2-config-volume\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255672 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41ef34b6-bb16-4602-a6b5-40597c0dc211-bound-sa-token\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255697 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gktjc\" (UniqueName: \"kubernetes.io/projected/d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f-kube-api-access-gktjc\") pod \"cluster-samples-operator-665b6dd947-jvs9x\" (UID: \"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255718 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqzt9\" (UniqueName: \"kubernetes.io/projected/f1038662-bd55-4e28-bd30-53e66f03ff85-kube-api-access-tqzt9\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255739 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-srv-cert\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255760 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/8f22acd3-56c4-42b8-badc-1239c0050781-images\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255783 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkdcv\" (UniqueName: \"kubernetes.io/projected/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-kube-api-access-tkdcv\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255806 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltcdg\" (UniqueName: \"kubernetes.io/projected/9b88cb44-5b2f-4838-bddd-7b3b17ebb629-kube-api-access-ltcdg\") pod \"dns-operator-744455d44c-mw7cs\" (UID: \"9b88cb44-5b2f-4838-bddd-7b3b17ebb629\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255834 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvksw\" (UniqueName: \"kubernetes.io/projected/f651303c-cd90-4d8c-92c4-519a02627eb5-kube-api-access-rvksw\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255856 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6h8\" (UniqueName: \"kubernetes.io/projected/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-kube-api-access-nj6h8\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255878 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255955 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-metrics-certs\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.255990 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-serving-cert\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256008 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1038662-bd55-4e28-bd30-53e66f03ff85-auth-proxy-config\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256029 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/ed831a75-86f6-4f98-a6f2-63e45e6f051b-tmpfs\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256045 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-trusted-ca\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256073 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjl5l\" (UniqueName: \"kubernetes.io/projected/41ef34b6-bb16-4602-a6b5-40597c0dc211-kube-api-access-jjl5l\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256099 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256117 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-proxy-tls\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 
06:53:25.256143 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxlp\" (UniqueName: \"kubernetes.io/projected/ed831a75-86f6-4f98-a6f2-63e45e6f051b-kube-api-access-7cxlp\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256164 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256187 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256203 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqrs\" (UniqueName: \"kubernetes.io/projected/fa494689-eaaa-455c-ba63-2f6a295d5a27-kube-api-access-prqrs\") pod \"downloads-7954f5f757-66bwg\" (UID: \"fa494689-eaaa-455c-ba63-2f6a295d5a27\") " pod="openshift-console/downloads-7954f5f757-66bwg" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256219 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5c72d0-500f-4d04-9a3c-76d815541c0a-config\") pod 
\"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256234 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-service-ca\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256253 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c27456e8-bb86-45c4-b482-dfc01d73f4b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-djgqn\" (UID: \"c27456e8-bb86-45c4-b482-dfc01d73f4b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256269 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r6cr\" (UniqueName: \"kubernetes.io/projected/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-kube-api-access-5r6cr\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256285 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41ef34b6-bb16-4602-a6b5-40597c0dc211-trusted-ca\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256309 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tfml8\" (UniqueName: \"kubernetes.io/projected/32383d71-3226-46ea-9d69-c3ab1096ec2c-kube-api-access-tfml8\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256326 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90244cab-89b7-4109-b673-a7cd881ae0a4-serving-cert\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256345 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/658ac624-7f09-4d74-bd73-8b00a997847f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256362 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f651303c-cd90-4d8c-92c4-519a02627eb5-signing-cabundle\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256381 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: 
\"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256398 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49e8732a-7f75-4b45-94d7-ad27168422b4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256414 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kmn\" (UniqueName: \"kubernetes.io/projected/c27456e8-bb86-45c4-b482-dfc01d73f4b5-kube-api-access-67kmn\") pod \"multus-admission-controller-857f4d67dd-djgqn\" (UID: \"c27456e8-bb86-45c4-b482-dfc01d73f4b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256431 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-oauth-serving-cert\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256447 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcr5q\" (UniqueName: \"kubernetes.io/projected/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-kube-api-access-mcr5q\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256469 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9fq72\" (UniqueName: \"kubernetes.io/projected/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-kube-api-access-9fq72\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256490 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256506 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256524 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5lc\" (UniqueName: \"kubernetes.io/projected/2cbecadb-0f2a-443e-b065-edc627985d96-kube-api-access-jp5lc\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256542 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88r4l\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-kube-api-access-88r4l\") pod \"image-registry-697d97f7c8-p5786\" (UID: 
\"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256559 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e8732a-7f75-4b45-94d7-ad27168422b4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256582 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-config\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256599 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256615 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-serving-cert\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256634 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-policies\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256649 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5sv\" (UniqueName: \"kubernetes.io/projected/4e6e8c5a-b937-4072-902d-28e056de16d2-kube-api-access-sp5sv\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256667 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256684 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9m89\" (UniqueName: \"kubernetes.io/projected/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-kube-api-access-j9m89\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256714 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq5m8\" (UniqueName: \"kubernetes.io/projected/84194086-c0d5-40d8-930d-a83c50b7dd3f-kube-api-access-pq5m8\") pod \"migrator-59844c95c7-thjv9\" (UID: \"84194086-c0d5-40d8-930d-a83c50b7dd3f\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256730 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-trusted-ca-bundle\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256745 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6e8c5a-b937-4072-902d-28e056de16d2-secret-volume\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256764 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-etcd-client\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256784 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1038662-bd55-4e28-bd30-53e66f03ff85-machine-approver-tls\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256800 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256806 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256804 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-registry-certificates\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256817 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.256905 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/658ac624-7f09-4d74-bd73-8b00a997847f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.257232 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:25.757217507 +0000 UTC m=+144.019105216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257386 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-policies\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257448 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-trusted-ca\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257478 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aebb0644-d181-4a38-94d6-5885ca2058ee-cert\") pod 
\"ingress-canary-ch664\" (UID: \"aebb0644-d181-4a38-94d6-5885ca2058ee\") " pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257526 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257553 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e8732a-7f75-4b45-94d7-ad27168422b4-config\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257581 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257606 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257620 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-config\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257631 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f22acd3-56c4-42b8-badc-1239c0050781-proxy-tls\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257633 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-trusted-ca\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257719 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b88cb44-5b2f-4838-bddd-7b3b17ebb629-metrics-tls\") pod \"dns-operator-744455d44c-mw7cs\" (UID: \"9b88cb44-5b2f-4838-bddd-7b3b17ebb629\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257747 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-serving-cert\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 
16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257779 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-client-ca\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257784 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257806 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbz56\" (UniqueName: \"kubernetes.io/projected/90244cab-89b7-4109-b673-a7cd881ae0a4-kube-api-access-sbz56\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257833 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-config-volume\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257861 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be028739-1351-4883-95ec-35fb89831c72-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257885 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpkc8\" (UniqueName: \"kubernetes.io/projected/d5615d08-80d1-4209-9cc4-c9b27e1ac024-kube-api-access-dpkc8\") pod \"package-server-manager-789f6589d5-ptbc9\" (UID: \"d5615d08-80d1-4209-9cc4-c9b27e1ac024\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.257933 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx4wm\" (UniqueName: \"kubernetes.io/projected/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-kube-api-access-rx4wm\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.258711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.258770 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be028739-1351-4883-95ec-35fb89831c72-ca-trust-extracted\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc 
kubenswrapper[4789]: I1216 06:53:25.259157 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e8732a-7f75-4b45-94d7-ad27168422b4-config\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.259276 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-client-ca\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.259551 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-config\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.259604 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqlns\" (UniqueName: \"kubernetes.io/projected/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-kube-api-access-tqlns\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.259629 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-client\") pod 
\"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.259672 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-registry-tls\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260167 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-config\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260272 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260477 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260542 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/90244cab-89b7-4109-b673-a7cd881ae0a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260593 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260621 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-socket-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260649 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-serving-cert\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260691 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5g4\" (UniqueName: \"kubernetes.io/projected/cce118b7-47b5-499b-9cc6-e5e24ba1c317-kube-api-access-fw5g4\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " 
pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260728 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be028739-1351-4883-95ec-35fb89831c72-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260794 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260857 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/90244cab-89b7-4109-b673-a7cd881ae0a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260848 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cbecadb-0f2a-443e-b065-edc627985d96-serving-cert\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260894 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtnng\" 
(UniqueName: \"kubernetes.io/projected/28e992ee-e81f-46d7-b422-27fa3023b7d8-kube-api-access-dtnng\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.260967 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkdxq\" (UniqueName: \"kubernetes.io/projected/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-kube-api-access-jkdxq\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261167 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5221dd3a-57e8-43ff-ac08-62cbfc025419-service-ca-bundle\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261202 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgbhs\" (UniqueName: \"kubernetes.io/projected/aebb0644-d181-4a38-94d6-5885ca2058ee-kube-api-access-jgbhs\") pod \"ingress-canary-ch664\" (UID: \"aebb0644-d181-4a38-94d6-5885ca2058ee\") " pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261228 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 
06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261251 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed831a75-86f6-4f98-a6f2-63e45e6f051b-webhook-cert\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261271 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-config\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261303 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-stats-auth\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261331 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261368 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-audit-policies\") pod 
\"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261390 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-images\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261412 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cf8011-671e-401b-ba7f-b062fa607e7f-serving-cert\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261432 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-config\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261463 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5615d08-80d1-4209-9cc4-c9b27e1ac024-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ptbc9\" (UID: \"d5615d08-80d1-4209-9cc4-c9b27e1ac024\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261484 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cce118b7-47b5-499b-9cc6-e5e24ba1c317-node-bootstrap-token\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261500 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261505 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1038662-bd55-4e28-bd30-53e66f03ff85-config\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261563 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5c72d0-500f-4d04-9a3c-76d815541c0a-serving-cert\") pod \"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.261789 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc 
kubenswrapper[4789]: I1216 06:53:25.262082 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5221dd3a-57e8-43ff-ac08-62cbfc025419-service-ca-bundle\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262209 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/120a166b-9fed-4921-940d-6c43c0a145c0-profile-collector-cert\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262245 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-encryption-config\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262277 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262326 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr79q\" (UniqueName: \"kubernetes.io/projected/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-kube-api-access-hr79q\") pod 
\"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262532 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmtr\" (UniqueName: \"kubernetes.io/projected/cc6f123f-a6d4-4451-bdb5-82286e190c55-kube-api-access-hbmtr\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262543 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262569 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-registration-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262599 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-csi-data-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262638 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-dir\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262687 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-ca\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262716 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-metrics-certs\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262709 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-898pl\" (UniqueName: \"kubernetes.io/projected/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-kube-api-access-898pl\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262758 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-dir\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262770 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f651303c-cd90-4d8c-92c4-519a02627eb5-signing-key\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262800 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41ef34b6-bb16-4602-a6b5-40597c0dc211-metrics-tls\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262831 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262856 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-mountpoint-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262879 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cce118b7-47b5-499b-9cc6-e5e24ba1c317-certs\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 
16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262935 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-bound-sa-token\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.262970 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-config\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263019 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-service-ca\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263063 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjw9\" (UniqueName: \"kubernetes.io/projected/5221dd3a-57e8-43ff-ac08-62cbfc025419-kube-api-access-fbjw9\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263093 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5cc4\" (UniqueName: \"kubernetes.io/projected/ee5c72d0-500f-4d04-9a3c-76d815541c0a-kube-api-access-x5cc4\") pod \"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263115 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-plugins-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263135 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lphz\" (UniqueName: \"kubernetes.io/projected/8f22acd3-56c4-42b8-badc-1239c0050781-kube-api-access-4lphz\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263169 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-oauth-config\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263193 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-metrics-tls\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263213 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/658ac624-7f09-4d74-bd73-8b00a997847f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263235 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnzz6\" (UniqueName: \"kubernetes.io/projected/120a166b-9fed-4921-940d-6c43c0a145c0-kube-api-access-mnzz6\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263262 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263287 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-default-certificate\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263310 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jvs9x\" (UID: \"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263437 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-ca\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263535 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263691 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-service-ca\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.263708 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.264168 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90244cab-89b7-4109-b673-a7cd881ae0a4-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.264707 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-proxy-tls\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.264745 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.264984 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-images\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.265080 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.265379 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.265682 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.265726 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.265732 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cf8011-671e-401b-ba7f-b062fa607e7f-serving-cert\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.265835 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be028739-1351-4883-95ec-35fb89831c72-installation-pull-secrets\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.266044 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.266428 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.266567 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49e8732a-7f75-4b45-94d7-ad27168422b4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.267164 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.267175 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-registry-tls\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 
16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.267428 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.267847 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jvs9x\" (UID: \"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.267962 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-default-certificate\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.267985 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5221dd3a-57e8-43ff-ac08-62cbfc025419-stats-auth\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.268394 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: 
\"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.268618 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51cf8011-671e-401b-ba7f-b062fa607e7f-etcd-client\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.285276 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.306046 4789 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.326750 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.361201 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9m2\" (UniqueName: \"kubernetes.io/projected/fa338783-6d00-4150-96d3-03ef1f28eb41-kube-api-access-rn9m2\") pod \"control-plane-machine-set-operator-78cbb6b69f-hx7q5\" (UID: \"fa338783-6d00-4150-96d3-03ef1f28eb41\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364064 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 
06:53:25.364157 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:25.864135215 +0000 UTC m=+144.126022844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364286 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjl5l\" (UniqueName: \"kubernetes.io/projected/41ef34b6-bb16-4602-a6b5-40597c0dc211-kube-api-access-jjl5l\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364312 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxlp\" (UniqueName: \"kubernetes.io/projected/ed831a75-86f6-4f98-a6f2-63e45e6f051b-kube-api-access-7cxlp\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364332 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5c72d0-500f-4d04-9a3c-76d815541c0a-config\") pod \"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364356 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c27456e8-bb86-45c4-b482-dfc01d73f4b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-djgqn\" (UID: \"c27456e8-bb86-45c4-b482-dfc01d73f4b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364372 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-service-ca\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364392 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r6cr\" (UniqueName: \"kubernetes.io/projected/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-kube-api-access-5r6cr\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364409 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41ef34b6-bb16-4602-a6b5-40597c0dc211-trusted-ca\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364426 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/658ac624-7f09-4d74-bd73-8b00a997847f-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364458 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f651303c-cd90-4d8c-92c4-519a02627eb5-signing-cabundle\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364475 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67kmn\" (UniqueName: \"kubernetes.io/projected/c27456e8-bb86-45c4-b482-dfc01d73f4b5-kube-api-access-67kmn\") pod \"multus-admission-controller-857f4d67dd-djgqn\" (UID: \"c27456e8-bb86-45c4-b482-dfc01d73f4b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364491 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-oauth-serving-cert\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364508 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcr5q\" (UniqueName: \"kubernetes.io/projected/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-kube-api-access-mcr5q\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364539 4789 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jp5lc\" (UniqueName: \"kubernetes.io/projected/2cbecadb-0f2a-443e-b065-edc627985d96-kube-api-access-jp5lc\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364560 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364587 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5sv\" (UniqueName: \"kubernetes.io/projected/4e6e8c5a-b937-4072-902d-28e056de16d2-kube-api-access-sp5sv\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364602 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364619 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-serving-cert\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364638 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9m89\" (UniqueName: \"kubernetes.io/projected/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-kube-api-access-j9m89\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364655 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq5m8\" (UniqueName: \"kubernetes.io/projected/84194086-c0d5-40d8-930d-a83c50b7dd3f-kube-api-access-pq5m8\") pod \"migrator-59844c95c7-thjv9\" (UID: \"84194086-c0d5-40d8-930d-a83c50b7dd3f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364671 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6e8c5a-b937-4072-902d-28e056de16d2-secret-volume\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364688 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-etcd-client\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364704 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-trusted-ca-bundle\") pod \"console-f9d7485db-cbqb2\" 
(UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364720 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364743 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-trusted-ca\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364759 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aebb0644-d181-4a38-94d6-5885ca2058ee-cert\") pod \"ingress-canary-ch664\" (UID: \"aebb0644-d181-4a38-94d6-5885ca2058ee\") " pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364776 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/658ac624-7f09-4d74-bd73-8b00a997847f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364792 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364807 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f22acd3-56c4-42b8-badc-1239c0050781-proxy-tls\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364825 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364841 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b88cb44-5b2f-4838-bddd-7b3b17ebb629-metrics-tls\") pod \"dns-operator-744455d44c-mw7cs\" (UID: \"9b88cb44-5b2f-4838-bddd-7b3b17ebb629\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364857 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-serving-cert\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364878 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-config-volume\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364896 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpkc8\" (UniqueName: \"kubernetes.io/projected/d5615d08-80d1-4209-9cc4-c9b27e1ac024-kube-api-access-dpkc8\") pod \"package-server-manager-789f6589d5-ptbc9\" (UID: \"d5615d08-80d1-4209-9cc4-c9b27e1ac024\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364952 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364968 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-socket-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.364985 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5g4\" (UniqueName: \"kubernetes.io/projected/cce118b7-47b5-499b-9cc6-e5e24ba1c317-kube-api-access-fw5g4\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " 
pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365000 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cbecadb-0f2a-443e-b065-edc627985d96-serving-cert\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365023 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkdxq\" (UniqueName: \"kubernetes.io/projected/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-kube-api-access-jkdxq\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365042 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtnng\" (UniqueName: \"kubernetes.io/projected/28e992ee-e81f-46d7-b422-27fa3023b7d8-kube-api-access-dtnng\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365044 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5c72d0-500f-4d04-9a3c-76d815541c0a-config\") pod \"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365063 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgbhs\" (UniqueName: 
\"kubernetes.io/projected/aebb0644-d181-4a38-94d6-5885ca2058ee-kube-api-access-jgbhs\") pod \"ingress-canary-ch664\" (UID: \"aebb0644-d181-4a38-94d6-5885ca2058ee\") " pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365093 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed831a75-86f6-4f98-a6f2-63e45e6f051b-webhook-cert\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365113 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-config\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365137 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-audit-policies\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365176 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5615d08-80d1-4209-9cc4-c9b27e1ac024-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ptbc9\" (UID: \"d5615d08-80d1-4209-9cc4-c9b27e1ac024\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365200 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cce118b7-47b5-499b-9cc6-e5e24ba1c317-node-bootstrap-token\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365223 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-encryption-config\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365252 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5c72d0-500f-4d04-9a3c-76d815541c0a-serving-cert\") pod \"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365275 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/120a166b-9fed-4921-940d-6c43c0a145c0-profile-collector-cert\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365296 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmtr\" (UniqueName: \"kubernetes.io/projected/cc6f123f-a6d4-4451-bdb5-82286e190c55-kube-api-access-hbmtr\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365317 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-registration-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365337 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-csi-data-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365373 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-898pl\" (UniqueName: \"kubernetes.io/projected/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-kube-api-access-898pl\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365397 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f651303c-cd90-4d8c-92c4-519a02627eb5-signing-key\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365420 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41ef34b6-bb16-4602-a6b5-40597c0dc211-metrics-tls\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365450 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-mountpoint-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365476 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cce118b7-47b5-499b-9cc6-e5e24ba1c317-certs\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365510 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5cc4\" (UniqueName: \"kubernetes.io/projected/ee5c72d0-500f-4d04-9a3c-76d815541c0a-kube-api-access-x5cc4\") pod \"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365529 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-plugins-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365562 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lphz\" (UniqueName: \"kubernetes.io/projected/8f22acd3-56c4-42b8-badc-1239c0050781-kube-api-access-4lphz\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: 
\"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365599 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-oauth-config\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365622 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/658ac624-7f09-4d74-bd73-8b00a997847f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365641 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-metrics-tls\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365662 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnzz6\" (UniqueName: \"kubernetes.io/projected/120a166b-9fed-4921-940d-6c43c0a145c0-kube-api-access-mnzz6\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/ed831a75-86f6-4f98-a6f2-63e45e6f051b-apiservice-cert\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365708 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365727 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-serving-cert\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365748 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365769 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-config\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365784 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365799 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc6f123f-a6d4-4451-bdb5-82286e190c55-audit-dir\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365814 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-client-ca\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365827 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/120a166b-9fed-4921-940d-6c43c0a145c0-srv-cert\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365843 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f22acd3-56c4-42b8-badc-1239c0050781-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365865 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-config\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365886 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6e8c5a-b937-4072-902d-28e056de16d2-config-volume\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365949 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-srv-cert\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365979 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8f22acd3-56c4-42b8-badc-1239c0050781-images\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.365995 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41ef34b6-bb16-4602-a6b5-40597c0dc211-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.366017 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltcdg\" (UniqueName: \"kubernetes.io/projected/9b88cb44-5b2f-4838-bddd-7b3b17ebb629-kube-api-access-ltcdg\") pod \"dns-operator-744455d44c-mw7cs\" (UID: \"9b88cb44-5b2f-4838-bddd-7b3b17ebb629\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.366032 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvksw\" (UniqueName: \"kubernetes.io/projected/f651303c-cd90-4d8c-92c4-519a02627eb5-kube-api-access-rvksw\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.366047 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6h8\" (UniqueName: \"kubernetes.io/projected/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-kube-api-access-nj6h8\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.366066 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed831a75-86f6-4f98-a6f2-63e45e6f051b-tmpfs\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.366080 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.366498 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41ef34b6-bb16-4602-a6b5-40597c0dc211-trusted-ca\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.366587 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.367481 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-trusted-ca\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.367612 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:25.867594065 +0000 UTC m=+144.129481694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.367808 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-config\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.368095 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.368223 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.368282 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-service-ca\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.368539 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/f651303c-cd90-4d8c-92c4-519a02627eb5-signing-cabundle\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.368652 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c27456e8-bb86-45c4-b482-dfc01d73f4b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-djgqn\" (UID: \"c27456e8-bb86-45c4-b482-dfc01d73f4b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.368858 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc6f123f-a6d4-4451-bdb5-82286e190c55-audit-policies\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.369182 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f22acd3-56c4-42b8-badc-1239c0050781-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.370892 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6e8c5a-b937-4072-902d-28e056de16d2-config-volume\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.371119 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-config\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.371592 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.371774 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-oauth-serving-cert\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.372398 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed831a75-86f6-4f98-a6f2-63e45e6f051b-webhook-cert\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.372864 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/658ac624-7f09-4d74-bd73-8b00a997847f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.373213 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5615d08-80d1-4209-9cc4-c9b27e1ac024-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ptbc9\" (UID: \"d5615d08-80d1-4209-9cc4-c9b27e1ac024\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.373289 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-encryption-config\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.373399 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-serving-cert\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.373627 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6e8c5a-b937-4072-902d-28e056de16d2-secret-volume\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.374187 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-srv-cert\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.374932 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-config\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.375571 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8f22acd3-56c4-42b8-badc-1239c0050781-images\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.376345 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-plugins-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.376391 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/cce118b7-47b5-499b-9cc6-e5e24ba1c317-node-bootstrap-token\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.376865 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-oauth-config\") pod \"console-f9d7485db-cbqb2\" (UID: 
\"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.376893 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.376960 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-mountpoint-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.377038 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc6f123f-a6d4-4451-bdb5-82286e190c55-audit-dir\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.377086 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/cce118b7-47b5-499b-9cc6-e5e24ba1c317-certs\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.377442 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5c72d0-500f-4d04-9a3c-76d815541c0a-serving-cert\") pod \"service-ca-operator-777779d784-7ntql\" (UID: 
\"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.377800 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ed831a75-86f6-4f98-a6f2-63e45e6f051b-tmpfs\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.378028 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-serving-cert\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.378617 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379002 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-metrics-tls\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379377 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/658ac624-7f09-4d74-bd73-8b00a997847f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: 
\"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379460 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379659 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-client-ca\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379695 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f22acd3-56c4-42b8-badc-1239c0050781-proxy-tls\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379830 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-serving-cert\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379903 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-registration-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.379977 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-socket-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.380485 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.380661 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-csi-data-dir\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.380882 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc6f123f-a6d4-4451-bdb5-82286e190c55-etcd-client\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.381052 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-trusted-ca-bundle\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.381201 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cbecadb-0f2a-443e-b065-edc627985d96-serving-cert\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.381345 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-config-volume\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.381482 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed831a75-86f6-4f98-a6f2-63e45e6f051b-apiservice-cert\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.381569 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f651303c-cd90-4d8c-92c4-519a02627eb5-signing-key\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.381789 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/9b88cb44-5b2f-4838-bddd-7b3b17ebb629-metrics-tls\") pod \"dns-operator-744455d44c-mw7cs\" (UID: \"9b88cb44-5b2f-4838-bddd-7b3b17ebb629\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.381908 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41ef34b6-bb16-4602-a6b5-40597c0dc211-metrics-tls\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.382857 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/120a166b-9fed-4921-940d-6c43c0a145c0-srv-cert\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.383498 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/120a166b-9fed-4921-940d-6c43c0a145c0-profile-collector-cert\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.385057 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.385804 4789 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.389817 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aebb0644-d181-4a38-94d6-5885ca2058ee-cert\") pod \"ingress-canary-ch664\" (UID: \"aebb0644-d181-4a38-94d6-5885ca2058ee\") " pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.406180 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.425406 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.443721 4789 request.go:700] Waited for 1.339253699s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&limit=500&resourceVersion=0 Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.445309 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.465702 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.467203 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.467446 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:25.967419269 +0000 UTC m=+144.229306898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.467615 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.468084 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:25.968070034 +0000 UTC m=+144.229957743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.492857 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.505979 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.525719 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.548452 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.549472 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.565060 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.568967 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.569458 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.069443135 +0000 UTC m=+144.331330764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.572796 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1038662-bd55-4e28-bd30-53e66f03ff85-machine-approver-tls\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.586084 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.586752 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1038662-bd55-4e28-bd30-53e66f03ff85-auth-proxy-config\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.605362 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.625692 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.644900 4789 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.652422 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1038662-bd55-4e28-bd30-53e66f03ff85-config\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.666091 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.670281 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.670760 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.170749994 +0000 UTC m=+144.432637623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.717086 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5"] Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.727199 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktjc\" (UniqueName: \"kubernetes.io/projected/d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f-kube-api-access-gktjc\") pod \"cluster-samples-operator-665b6dd947-jvs9x\" (UID: \"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.741130 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqzt9\" (UniqueName: \"kubernetes.io/projected/f1038662-bd55-4e28-bd30-53e66f03ff85-kube-api-access-tqzt9\") pod \"machine-approver-56656f9798-g8md7\" (UID: \"f1038662-bd55-4e28-bd30-53e66f03ff85\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.748122 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.770705 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkdcv\" (UniqueName: \"kubernetes.io/projected/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-kube-api-access-tkdcv\") pod \"route-controller-manager-6576b87f9c-kpcbh\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.771007 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.771150 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.27104724 +0000 UTC m=+144.532934869 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.771395 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.771993 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.271982171 +0000 UTC m=+144.533869790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.780574 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49e8732a-7f75-4b45-94d7-ad27168422b4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wxvc4\" (UID: \"49e8732a-7f75-4b45-94d7-ad27168422b4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.792345 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.804017 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqrs\" (UniqueName: \"kubernetes.io/projected/fa494689-eaaa-455c-ba63-2f6a295d5a27-kube-api-access-prqrs\") pod \"downloads-7954f5f757-66bwg\" (UID: \"fa494689-eaaa-455c-ba63-2f6a295d5a27\") " pod="openshift-console/downloads-7954f5f757-66bwg" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.808460 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.821469 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fq72\" (UniqueName: \"kubernetes.io/projected/8b2c6c23-962c-4829-bd8a-088c7c63dfa4-kube-api-access-9fq72\") pod \"machine-api-operator-5694c8668f-fnph9\" (UID: \"8b2c6c23-962c-4829-bd8a-088c7c63dfa4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.824744 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" event={"ID":"fa338783-6d00-4150-96d3-03ef1f28eb41","Type":"ContainerStarted","Data":"dca0a0c3ac61ef238e2f3cf22cfa8186349b69f754f8859c12d6f6537a3adaa9"} Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.828195 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" event={"ID":"f1038662-bd55-4e28-bd30-53e66f03ff85","Type":"ContainerStarted","Data":"390a9ae88a1fd9076e0edff622b098051890d507649febadea1e7cd6836d7bd3"} Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.830248 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" event={"ID":"2740e575-b6e0-470b-acd2-bf03614e7d35","Type":"ContainerStarted","Data":"b6bc9830ee5007855b439fce844a7a583c966cf812badc00d43e47bfa4ced74f"} Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.832898 4789 generic.go:334] "Generic (PLEG): container finished" podID="c06d6dec-3c45-42f3-bd57-dece3f5dafe6" containerID="1b4656ab868afb10f85bcab38425ed3b51a4ba5a39144aa6971f0c8e895b1c73" exitCode=0 Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.833028 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-slw4b" event={"ID":"c06d6dec-3c45-42f3-bd57-dece3f5dafe6","Type":"ContainerDied","Data":"1b4656ab868afb10f85bcab38425ed3b51a4ba5a39144aa6971f0c8e895b1c73"} Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.835608 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" event={"ID":"e3419dec-7204-424c-8795-eb11d1b22316","Type":"ContainerStarted","Data":"edf482088e27329ef9de8ef4414425ed1cf843968779da55b01609141d2e9dde"} Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.841693 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfml8\" (UniqueName: \"kubernetes.io/projected/32383d71-3226-46ea-9d69-c3ab1096ec2c-kube-api-access-tfml8\") pod \"oauth-openshift-558db77b4-s4s66\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.870734 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88r4l\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-kube-api-access-88r4l\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.875309 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.876577 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.376557275 +0000 UTC m=+144.638444894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.888361 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbz56\" (UniqueName: \"kubernetes.io/projected/90244cab-89b7-4109-b673-a7cd881ae0a4-kube-api-access-sbz56\") pod \"openshift-config-operator-7777fb866f-ppm2p\" (UID: \"90244cab-89b7-4109-b673-a7cd881ae0a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.895969 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.903046 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.903385 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx4wm\" (UniqueName: \"kubernetes.io/projected/4813f43d-a284-4253-a8ec-ecc3c7a0ae84-kube-api-access-rx4wm\") pod \"kube-storage-version-migrator-operator-b67b599dd-6rwn9\" (UID: \"4813f43d-a284-4253-a8ec-ecc3c7a0ae84\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.910743 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.933192 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-66bwg" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.936706 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9zmd\" (UniqueName: \"kubernetes.io/projected/51cf8011-671e-401b-ba7f-b062fa607e7f-kube-api-access-m9zmd\") pod \"etcd-operator-b45778765-lvrjn\" (UID: \"51cf8011-671e-401b-ba7f-b062fa607e7f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.938861 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqlns\" (UniqueName: \"kubernetes.io/projected/62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e-kube-api-access-tqlns\") pod \"openshift-controller-manager-operator-756b6f6bc6-ccr2b\" (UID: \"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.956332 4789 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.960020 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2thgc\" (UID: \"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.969380 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.979365 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:25 crc kubenswrapper[4789]: E1216 06:53:25.979682 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.479671056 +0000 UTC m=+144.741558685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.979883 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" Dec 16 06:53:25 crc kubenswrapper[4789]: I1216 06:53:25.980550 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr79q\" (UniqueName: \"kubernetes.io/projected/a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f-kube-api-access-hr79q\") pod \"machine-config-controller-84d6567774-8vm2v\" (UID: \"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.004335 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-bound-sa-token\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.023346 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbjw9\" (UniqueName: \"kubernetes.io/projected/5221dd3a-57e8-43ff-ac08-62cbfc025419-kube-api-access-fbjw9\") pod \"router-default-5444994796-pwj9t\" (UID: \"5221dd3a-57e8-43ff-ac08-62cbfc025419\") " pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 
06:53:26.041494 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjl5l\" (UniqueName: \"kubernetes.io/projected/41ef34b6-bb16-4602-a6b5-40597c0dc211-kube-api-access-jjl5l\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.058805 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.069986 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r6cr\" (UniqueName: \"kubernetes.io/projected/c86e6908-9ec3-4e62-b9cf-86f136b1dc6a-kube-api-access-5r6cr\") pod \"console-operator-58897d9998-s57cr\" (UID: \"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a\") " pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.080935 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.081187 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgbhs\" (UniqueName: \"kubernetes.io/projected/aebb0644-d181-4a38-94d6-5885ca2058ee-kube-api-access-jgbhs\") pod \"ingress-canary-ch664\" (UID: \"aebb0644-d181-4a38-94d6-5885ca2058ee\") " pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.081292 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.581272401 +0000 UTC m=+144.843160130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.081435 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.081809 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.581790884 +0000 UTC m=+144.843678513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.111735 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.118001 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.127566 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq5m8\" (UniqueName: \"kubernetes.io/projected/84194086-c0d5-40d8-930d-a83c50b7dd3f-kube-api-access-pq5m8\") pod \"migrator-59844c95c7-thjv9\" (UID: \"84194086-c0d5-40d8-930d-a83c50b7dd3f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.148071 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5sv\" (UniqueName: \"kubernetes.io/projected/4e6e8c5a-b937-4072-902d-28e056de16d2-kube-api-access-sp5sv\") pod \"collect-profiles-29431125-8wqnv\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.158313 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.170161 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9m89\" (UniqueName: \"kubernetes.io/projected/8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae-kube-api-access-j9m89\") pod \"csi-hostpathplugin-7stjw\" (UID: \"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae\") " pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.182769 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 
06:53:26.183516 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.683490611 +0000 UTC m=+144.945378240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.184030 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcr5q\" (UniqueName: \"kubernetes.io/projected/6e51b8b3-6bc6-4462-ae45-eb782f3c27f2-kube-api-access-mcr5q\") pod \"olm-operator-6b444d44fb-7zfdj\" (UID: \"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.195079 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.200163 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ch664" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.205770 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5lc\" (UniqueName: \"kubernetes.io/projected/2cbecadb-0f2a-443e-b065-edc627985d96-kube-api-access-jp5lc\") pod \"controller-manager-879f6c89f-l6ctc\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.224114 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.238603 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/658ac624-7f09-4d74-bd73-8b00a997847f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9mh47\" (UID: \"658ac624-7f09-4d74-bd73-8b00a997847f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.248120 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxlp\" (UniqueName: \"kubernetes.io/projected/ed831a75-86f6-4f98-a6f2-63e45e6f051b-kube-api-access-7cxlp\") pod \"packageserver-d55dfcdfc-grmxx\" (UID: \"ed831a75-86f6-4f98-a6f2-63e45e6f051b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.252902 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fnph9"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.252959 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p"] Dec 16 06:53:26 crc kubenswrapper[4789]: 
I1216 06:53:26.260531 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.272686 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpkc8\" (UniqueName: \"kubernetes.io/projected/d5615d08-80d1-4209-9cc4-c9b27e1ac024-kube-api-access-dpkc8\") pod \"package-server-manager-789f6589d5-ptbc9\" (UID: \"d5615d08-80d1-4209-9cc4-c9b27e1ac024\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.284585 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kmn\" (UniqueName: \"kubernetes.io/projected/c27456e8-bb86-45c4-b482-dfc01d73f4b5-kube-api-access-67kmn\") pod \"multus-admission-controller-857f4d67dd-djgqn\" (UID: \"c27456e8-bb86-45c4-b482-dfc01d73f4b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.284582 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.284853 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.784840932 +0000 UTC m=+145.046728561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.290684 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.302608 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2c6c23_962c_4829_bd8a_088c7c63dfa4.slice/crio-a12fe5c8b3e814c14e729471164778f2045df6bfbcf70ce5ee68810a01705cd8 WatchSource:0}: Error finding container a12fe5c8b3e814c14e729471164778f2045df6bfbcf70ce5ee68810a01705cd8: Status 404 returned error can't find the container with id a12fe5c8b3e814c14e729471164778f2045df6bfbcf70ce5ee68810a01705cd8 Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.303067 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnzz6\" (UniqueName: \"kubernetes.io/projected/120a166b-9fed-4921-940d-6c43c0a145c0-kube-api-access-mnzz6\") pod \"catalog-operator-68c6474976-rnfxb\" (UID: \"120a166b-9fed-4921-940d-6c43c0a145c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.306363 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.306860 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.308213 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.326048 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.329777 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltcdg\" (UniqueName: \"kubernetes.io/projected/9b88cb44-5b2f-4838-bddd-7b3b17ebb629-kube-api-access-ltcdg\") pod \"dns-operator-744455d44c-mw7cs\" (UID: \"9b88cb44-5b2f-4838-bddd-7b3b17ebb629\") " pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.337756 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.347289 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.348348 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.348544 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbmtr\" (UniqueName: \"kubernetes.io/projected/cc6f123f-a6d4-4451-bdb5-82286e190c55-kube-api-access-hbmtr\") pod \"apiserver-7bbb656c7d-6nzjt\" (UID: \"cc6f123f-a6d4-4451-bdb5-82286e190c55\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.356810 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.369741 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41ef34b6-bb16-4602-a6b5-40597c0dc211-bound-sa-token\") pod \"ingress-operator-5b745b69d9-swk2d\" (UID: \"41ef34b6-bb16-4602-a6b5-40597c0dc211\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.371206 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.379171 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.388594 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.391656 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.891620817 +0000 UTC m=+145.153508476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.391893 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-66bwg"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.392031 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.394269 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5cc4\" (UniqueName: \"kubernetes.io/projected/ee5c72d0-500f-4d04-9a3c-76d815541c0a-kube-api-access-x5cc4\") pod \"service-ca-operator-777779d784-7ntql\" (UID: \"ee5c72d0-500f-4d04-9a3c-76d815541c0a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.403970 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.407477 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkdxq\" (UniqueName: \"kubernetes.io/projected/a8e72ec5-bc56-49fc-99e2-d8f246487fd4-kube-api-access-jkdxq\") pod \"cluster-image-registry-operator-dc59b4c8b-mt5pk\" (UID: \"a8e72ec5-bc56-49fc-99e2-d8f246487fd4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.416371 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s4s66"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.424105 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.433370 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvksw\" (UniqueName: \"kubernetes.io/projected/f651303c-cd90-4d8c-92c4-519a02627eb5-kube-api-access-rvksw\") pod \"service-ca-9c57cc56f-hspwh\" (UID: \"f651303c-cd90-4d8c-92c4-519a02627eb5\") " pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.433592 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.444973 4789 request.go:700] Waited for 1.06611474s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/serviceaccounts/machine-config-operator/token Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.445957 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.446653 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.450076 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6h8\" (UniqueName: \"kubernetes.io/projected/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-kube-api-access-nj6h8\") pod \"marketplace-operator-79b997595-cmw44\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.453835 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.461905 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.488885 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7stjw"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.489785 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.490116 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:26.990104311 +0000 UTC m=+145.251991940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.490758 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtnng\" (UniqueName: \"kubernetes.io/projected/28e992ee-e81f-46d7-b422-27fa3023b7d8-kube-api-access-dtnng\") pod \"console-f9d7485db-cbqb2\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.508400 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa494689_eaaa_455c_ba63_2f6a295d5a27.slice/crio-a79eaaef390904be1879f1e23b16bd62827cc9adcf195ae6fd6b858dcf491905 WatchSource:0}: Error finding container a79eaaef390904be1879f1e23b16bd62827cc9adcf195ae6fd6b858dcf491905: Status 404 returned error can't find the container with id a79eaaef390904be1879f1e23b16bd62827cc9adcf195ae6fd6b858dcf491905 Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.511216 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.514425 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5g4\" (UniqueName: \"kubernetes.io/projected/cce118b7-47b5-499b-9cc6-e5e24ba1c317-kube-api-access-fw5g4\") pod \"machine-config-server-7fvzt\" (UID: \"cce118b7-47b5-499b-9cc6-e5e24ba1c317\") " 
pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.526380 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-898pl\" (UniqueName: \"kubernetes.io/projected/dbadb44f-ac2d-4056-97c6-1af7fb39e4f1-kube-api-access-898pl\") pod \"dns-default-98ssf\" (UID: \"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1\") " pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.532519 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.545086 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ch664"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.590446 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lvrjn"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.591108 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.591295 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.091272617 +0000 UTC m=+145.353160246 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.591751 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.592153 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.092141717 +0000 UTC m=+145.354029376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.598701 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.611569 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s57cr"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.644759 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.692872 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.693129 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.693151 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.193131568 +0000 UTC m=+145.455019197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.693380 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.693745 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.193729252 +0000 UTC m=+145.455616881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.701044 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lphz\" (UniqueName: \"kubernetes.io/projected/8f22acd3-56c4-42b8-badc-1239c0050781-kube-api-access-4lphz\") pod \"machine-config-operator-74547568cd-gkn6m\" (UID: \"8f22acd3-56c4-42b8-badc-1239c0050781\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.708449 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8077458f_c8a9_4cc7_a8b1_4ae376d5e5ae.slice/crio-a67141c700ef75af892cd3843e6011f3d3a3303ff7b78ddfade0bab68d52a6b9 WatchSource:0}: Error finding container a67141c700ef75af892cd3843e6011f3d3a3303ff7b78ddfade0bab68d52a6b9: Status 404 returned error can't find the container with id a67141c700ef75af892cd3843e6011f3d3a3303ff7b78ddfade0bab68d52a6b9 Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.713963 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.715738 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f62aa0_33ed_424a_a0b4_92a3ce4ed97e.slice/crio-480d320c8a02aa144fa9e9db95a467caac2aa437feffcabb544e5ff1b38488e5 WatchSource:0}: Error finding container 480d320c8a02aa144fa9e9db95a467caac2aa437feffcabb544e5ff1b38488e5: Status 404 returned error can't find the container with id 480d320c8a02aa144fa9e9db95a467caac2aa437feffcabb544e5ff1b38488e5 Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.716389 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaebb0644_d181_4a38_94d6_5885ca2058ee.slice/crio-26ef4ec18f9aa3aad965705e3bc729619314fdb7b60db4490c15942b1c2fb469 WatchSource:0}: Error finding container 26ef4ec18f9aa3aad965705e3bc729619314fdb7b60db4490c15942b1c2fb469: Status 404 returned error can't find the container with id 26ef4ec18f9aa3aad965705e3bc729619314fdb7b60db4490c15942b1c2fb469 Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.717771 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4813f43d_a284_4253_a8ec_ecc3c7a0ae84.slice/crio-766bffb497bd79cc7d340991d8c3aad8be36620e556eb20ce7025bfe82de9177 WatchSource:0}: Error finding container 766bffb497bd79cc7d340991d8c3aad8be36620e556eb20ce7025bfe82de9177: Status 404 returned error can't find the container with id 766bffb497bd79cc7d340991d8c3aad8be36620e556eb20ce7025bfe82de9177 Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.718887 4789 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51cf8011_671e_401b_ba7f_b062fa607e7f.slice/crio-cfa80b042cfbff59ee1845dad3c0617d60493a3b83f1f68224f9a3a9265a8ec0 WatchSource:0}: Error finding container cfa80b042cfbff59ee1845dad3c0617d60493a3b83f1f68224f9a3a9265a8ec0: Status 404 returned error can't find the container with id cfa80b042cfbff59ee1845dad3c0617d60493a3b83f1f68224f9a3a9265a8ec0 Dec 16 06:53:26 crc kubenswrapper[4789]: W1216 06:53:26.720324 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86e6908_9ec3_4e62_b9cf_86f136b1dc6a.slice/crio-4dbaa63bcf8f88f522e1d45dd21a3fcc29a1a228a9ee82905e2f6cc38a0afeb0 WatchSource:0}: Error finding container 4dbaa63bcf8f88f522e1d45dd21a3fcc29a1a228a9ee82905e2f6cc38a0afeb0: Status 404 returned error can't find the container with id 4dbaa63bcf8f88f522e1d45dd21a3fcc29a1a228a9ee82905e2f6cc38a0afeb0 Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.769956 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7fvzt" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.780015 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.794285 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.794707 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.294680693 +0000 UTC m=+145.556568322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.842499 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pwj9t" event={"ID":"5221dd3a-57e8-43ff-ac08-62cbfc025419","Type":"ContainerStarted","Data":"75392dfd58000c46ec2a68e49e169186bf2d7a89cb70f65f993d3852e7bb5f4c"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.845813 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" event={"ID":"49e8732a-7f75-4b45-94d7-ad27168422b4","Type":"ContainerStarted","Data":"67335a550776ec0258500c874f9e6bf5bab876d4a89e637b1abcdeae8a4c5e30"} 
Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.846679 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" event={"ID":"32383d71-3226-46ea-9d69-c3ab1096ec2c","Type":"ContainerStarted","Data":"f3d6929635ed92e90ef544b574f3b1d74abea5ac59cf6ac483748f3278930807"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.847472 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" event={"ID":"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae","Type":"ContainerStarted","Data":"a67141c700ef75af892cd3843e6011f3d3a3303ff7b78ddfade0bab68d52a6b9"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.848181 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ch664" event={"ID":"aebb0644-d181-4a38-94d6-5885ca2058ee","Type":"ContainerStarted","Data":"26ef4ec18f9aa3aad965705e3bc729619314fdb7b60db4490c15942b1c2fb469"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.853811 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" event={"ID":"4813f43d-a284-4253-a8ec-ecc3c7a0ae84","Type":"ContainerStarted","Data":"766bffb497bd79cc7d340991d8c3aad8be36620e556eb20ce7025bfe82de9177"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.855087 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" event={"ID":"90244cab-89b7-4109-b673-a7cd881ae0a4","Type":"ContainerStarted","Data":"895051a199a59449d0cf5dbe12b2847b3f01332173cbae0f50b7dc52c5025f33"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.855758 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" 
event={"ID":"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e","Type":"ContainerStarted","Data":"480d320c8a02aa144fa9e9db95a467caac2aa437feffcabb544e5ff1b38488e5"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.856504 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" event={"ID":"51cf8011-671e-401b-ba7f-b062fa607e7f","Type":"ContainerStarted","Data":"cfa80b042cfbff59ee1845dad3c0617d60493a3b83f1f68224f9a3a9265a8ec0"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.857118 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" event={"ID":"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f","Type":"ContainerStarted","Data":"480431cd0629928f804fb35d0be6ce22822b50f4e74a3e6c561ef78dad27d2bb"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.857694 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" event={"ID":"84194086-c0d5-40d8-930d-a83c50b7dd3f","Type":"ContainerStarted","Data":"a17ffcee321e3019cc0ae071bba82dd32c4db1e4f9bfc655035e3db645a81010"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.864276 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" event={"ID":"8b2c6c23-962c-4829-bd8a-088c7c63dfa4","Type":"ContainerStarted","Data":"a12fe5c8b3e814c14e729471164778f2045df6bfbcf70ce5ee68810a01705cd8"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.878485 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-s57cr" event={"ID":"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a","Type":"ContainerStarted","Data":"4dbaa63bcf8f88f522e1d45dd21a3fcc29a1a228a9ee82905e2f6cc38a0afeb0"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.884939 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" event={"ID":"fa338783-6d00-4150-96d3-03ef1f28eb41","Type":"ContainerStarted","Data":"c88a8950a86824164e78cde78fc1d13ade4212fc80f94ce951ab221dbbfd090a"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.889464 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-66bwg" event={"ID":"fa494689-eaaa-455c-ba63-2f6a295d5a27","Type":"ContainerStarted","Data":"a79eaaef390904be1879f1e23b16bd62827cc9adcf195ae6fd6b858dcf491905"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.896089 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.898050 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" event={"ID":"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e","Type":"ContainerStarted","Data":"b29a9a1ba78403fab62cb193e2574866bba0e65f886dfb00f346a68dcfc362ac"} Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.898598 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.398565991 +0000 UTC m=+145.660453630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.900466 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" event={"ID":"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b","Type":"ContainerStarted","Data":"8a7fb95d9ac2ed889165af45ac4b86a1843b364d387ee4ddacfe9530a58b0d99"} Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.915500 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.937967 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9"] Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.998535 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:26 crc kubenswrapper[4789]: E1216 06:53:26.998707 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:27.498680244 +0000 UTC m=+145.760567883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:26 crc kubenswrapper[4789]: I1216 06:53:26.999280 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.000803 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.500791442 +0000 UTC m=+145.762679081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.032904 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mw7cs"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.104214 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.104764 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.604746441 +0000 UTC m=+145.866634080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.172454 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6ctc"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.194722 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv"] Dec 16 06:53:27 crc kubenswrapper[4789]: W1216 06:53:27.200031 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b88cb44_5b2f_4838_bddd_7b3b17ebb629.slice/crio-7293046874f5142d65040dfadf6c5e8bced3aa9c521a50568667a9f7683fe785 WatchSource:0}: Error finding container 7293046874f5142d65040dfadf6c5e8bced3aa9c521a50568667a9f7683fe785: Status 404 returned error can't find the container with id 7293046874f5142d65040dfadf6c5e8bced3aa9c521a50568667a9f7683fe785 Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.206054 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.206552 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.706532082 +0000 UTC m=+145.968419711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.307374 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.307764 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.807748038 +0000 UTC m=+146.069635657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.379097 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-sg5st" podStartSLOduration=127.379073548 podStartE2EDuration="2m7.379073548s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:27.35651356 +0000 UTC m=+145.618401189" watchObservedRunningTime="2025-12-16 06:53:27.379073548 +0000 UTC m=+145.640961177" Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.411063 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.411596 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:27.911580946 +0000 UTC m=+146.173468575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.523490 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.524126 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.024111703 +0000 UTC m=+146.285999332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.552201 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.628077 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.628606 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.128596276 +0000 UTC m=+146.390483905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.682583 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.703684 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-68bzj" podStartSLOduration=127.703668331 podStartE2EDuration="2m7.703668331s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:27.701344177 +0000 UTC m=+145.963231806" watchObservedRunningTime="2025-12-16 06:53:27.703668331 +0000 UTC m=+145.965555960" Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.715700 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmw44"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.738565 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.739083 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.239047034 +0000 UTC m=+146.500934663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.834326 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.839684 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 06:53:27.840080 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.340068127 +0000 UTC m=+146.601955746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.880965 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.892718 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-djgqn"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.907619 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.909802 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hspwh"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.939512 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" event={"ID":"f1038662-bd55-4e28-bd30-53e66f03ff85","Type":"ContainerStarted","Data":"581d93ea0804c71ccb714e7aea1203f20f220d6883522241ec6ab5dc7b4c34fe"} Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.943294 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:27 crc kubenswrapper[4789]: E1216 
06:53:27.943640 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.443620778 +0000 UTC m=+146.705508407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.945360 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" event={"ID":"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b","Type":"ContainerStarted","Data":"01eb85cf7c8025f23a704fb0fb3111decb0621182b844fb9c22307e77df967d8"} Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.946354 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.948156 4789 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-kpcbh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.948197 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.965635 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-cbqb2"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.979964 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47"] Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.989054 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" event={"ID":"9b88cb44-5b2f-4838-bddd-7b3b17ebb629","Type":"ContainerStarted","Data":"7293046874f5142d65040dfadf6c5e8bced3aa9c521a50568667a9f7683fe785"} Dec 16 06:53:27 crc kubenswrapper[4789]: I1216 06:53:27.997739 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7ntql"] Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.003699 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt"] Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.014514 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ch664" event={"ID":"aebb0644-d181-4a38-94d6-5885ca2058ee","Type":"ContainerStarted","Data":"464821ce03591c35ec510ecc2651b596e1452c4723199d95307c32536c71e677"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.040737 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" event={"ID":"62a9e6e3-f3c1-4c12-8e27-6f5d66f4500e","Type":"ContainerStarted","Data":"267efbb63f3061c48a5633da2af942f832caf43a3140ad616a5d3dabeb2a16c5"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.045165 
4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.045228 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.045307 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.046549 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:28 crc kubenswrapper[4789]: E1216 06:53:28.046770 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:28.546760119 +0000 UTC m=+146.808647738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.058153 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" event={"ID":"49e8732a-7f75-4b45-94d7-ad27168422b4","Type":"ContainerStarted","Data":"16e871493c4da8adb31621db57f86769bc9c7cb24b5e1cae87063ab01106cd05"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.071598 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.075702 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" event={"ID":"ed831a75-86f6-4f98-a6f2-63e45e6f051b","Type":"ContainerStarted","Data":"3f1f9768781c34e4e377aaf8f31b947be509a57496033feeb4c722adfb09553b"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.097638 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" 
event={"ID":"120a166b-9fed-4921-940d-6c43c0a145c0","Type":"ContainerStarted","Data":"a9c4e6dd8f1a3c4d1001baff2f198053e9d965d7e120f5d45ebcdb0479cc9aaf"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.123585 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" event={"ID":"84194086-c0d5-40d8-930d-a83c50b7dd3f","Type":"ContainerStarted","Data":"154049aee93962058929a254a30e047ba7dee05001fd4a614040e1553d2dc348"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.123624 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" event={"ID":"8b2c6c23-962c-4829-bd8a-088c7c63dfa4","Type":"ContainerStarted","Data":"b8a9d0d3e80b9eb1d25e82e021daa6013b1772eb6fe82ad920a121d62135cbd6"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.124529 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" event={"ID":"c06d6dec-3c45-42f3-bd57-dece3f5dafe6","Type":"ContainerStarted","Data":"4ca14818d5b2c3f534f778859f8072546b81cd3409ad51e9af603771b07b18a5"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.138607 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" event={"ID":"03d51736-0f2b-4c40-b6f1-ee44fa4312f9","Type":"ContainerStarted","Data":"699850b79f63a71b434d40c2b1039ce98122c0dfb399474b77366ee5246773fa"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.145823 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.145996 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ingress/router-default-5444994796-pwj9t" event={"ID":"5221dd3a-57e8-43ff-ac08-62cbfc025419","Type":"ContainerStarted","Data":"e6c5abd8931458993d54e2dc185b6f08a26eaa2c4e994db509c8080bed50f9e9"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.146076 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.146214 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:28 crc kubenswrapper[4789]: E1216 06:53:28.147428 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:28.647409363 +0000 UTC m=+146.909296992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.147703 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-98ssf"] Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.148889 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m"] Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.149559 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.149777 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" event={"ID":"4e6e8c5a-b937-4072-902d-28e056de16d2","Type":"ContainerStarted","Data":"fdb9873985c6b8d0f9f3ef053d6eb8451198c0e54057ebbd360f6969b9d250ed"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.150632 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 
16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.151856 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7fvzt" event={"ID":"cce118b7-47b5-499b-9cc6-e5e24ba1c317","Type":"ContainerStarted","Data":"9da065d4d7dc75e8b64cbb266d39e01111531f43f51dd89155b1415bbd1f467a"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.153164 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-s57cr" event={"ID":"c86e6908-9ec3-4e62-b9cf-86f136b1dc6a","Type":"ContainerStarted","Data":"2b93e3f1f744fd684d73e353559af94e247df195b953ec358ef54e95237f511d"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.153889 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.154653 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" event={"ID":"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f","Type":"ContainerStarted","Data":"004b48ce5502fc7eb9211b46f4a0c8b6b0270dbc730720f88df48e0283efd7a5"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.156989 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" event={"ID":"32383d71-3226-46ea-9d69-c3ab1096ec2c","Type":"ContainerStarted","Data":"9e8eac2b1e7f2c1ddbb9be410ad8532c433010e87cb5b6050e74cd1fe2eacaa8"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.157049 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.158670 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" 
event={"ID":"4813f43d-a284-4253-a8ec-ecc3c7a0ae84","Type":"ContainerStarted","Data":"bbc044d54f83a847400c0fef30cbbe9c2ea8618c60a4ca1991b8e3994a13c061"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.171172 4789 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-s4s66 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.171251 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" podUID="32383d71-3226-46ea-9d69-c3ab1096ec2c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.173031 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" event={"ID":"d5615d08-80d1-4209-9cc4-c9b27e1ac024","Type":"ContainerStarted","Data":"e36299cb03068aafcc6bd89e066c253d680733f755574a3993d494b6be3f5b24"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.173110 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" event={"ID":"d5615d08-80d1-4209-9cc4-c9b27e1ac024","Type":"ContainerStarted","Data":"7de77d65aeb1033dd21e24643c93f61319c9536af4deb63eac378ba33a9f0866"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.176021 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-66bwg" event={"ID":"fa494689-eaaa-455c-ba63-2f6a295d5a27","Type":"ContainerStarted","Data":"3fdd6460c6ca09ea5a90f5dea688666d8dde9e6d7ac63acc2ed11a242ec7022f"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.176070 4789 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-66bwg" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.177222 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" event={"ID":"2cbecadb-0f2a-443e-b065-edc627985d96","Type":"ContainerStarted","Data":"7a29fee741ba4b33ab36257fb079fe0d2e8a25e1eaa33a83c11b7d5c22190099"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.637891 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.639554 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:28 crc kubenswrapper[4789]: E1216 06:53:28.640141 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.14010737 +0000 UTC m=+147.401995029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.640389 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.640437 4789 patch_prober.go:28] interesting pod/downloads-7954f5f757-66bwg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.640521 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-66bwg" podUID="fa494689-eaaa-455c-ba63-2f6a295d5a27" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.641383 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.641393 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.649688 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:28 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:28 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:28 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.649742 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.650036 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" event={"ID":"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f","Type":"ContainerStarted","Data":"81e23d5378a99973d2c9ef93e41f6a16ebdf11f4bf2e8ed3afdc18559039ea62"} Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.659025 4789 generic.go:334] "Generic (PLEG): container finished" podID="90244cab-89b7-4109-b673-a7cd881ae0a4" containerID="155385fe841b4cba2a26ecd5d512b2da4fa8aacf27499bc540cee6a9fe1bb0ae" exitCode=0 Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.660085 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" 
event={"ID":"90244cab-89b7-4109-b673-a7cd881ae0a4","Type":"ContainerDied","Data":"155385fe841b4cba2a26ecd5d512b2da4fa8aacf27499bc540cee6a9fe1bb0ae"} Dec 16 06:53:28 crc kubenswrapper[4789]: W1216 06:53:28.691110 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbadb44f_ac2d_4056_97c6_1af7fb39e4f1.slice/crio-3936828e03c9cbd055e963133cfb7750c67371b0435d568c89f2840fde87d495 WatchSource:0}: Error finding container 3936828e03c9cbd055e963133cfb7750c67371b0435d568c89f2840fde87d495: Status 404 returned error can't find the container with id 3936828e03c9cbd055e963133cfb7750c67371b0435d568c89f2840fde87d495 Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.696407 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxvc4" podStartSLOduration=127.696382764 podStartE2EDuration="2m7.696382764s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.680896438 +0000 UTC m=+146.942784107" watchObservedRunningTime="2025-12-16 06:53:28.696382764 +0000 UTC m=+146.958270413" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.734014 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6rwn9" podStartSLOduration=127.733998468 podStartE2EDuration="2m7.733998468s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.707478109 +0000 UTC m=+146.969365758" watchObservedRunningTime="2025-12-16 06:53:28.733998468 +0000 UTC m=+146.995886097" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 
06:53:28.735824 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ccr2b" podStartSLOduration=127.73581764 podStartE2EDuration="2m7.73581764s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.733353454 +0000 UTC m=+146.995241073" watchObservedRunningTime="2025-12-16 06:53:28.73581764 +0000 UTC m=+146.997705269" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.740704 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:28 crc kubenswrapper[4789]: E1216 06:53:28.742647 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.242609456 +0000 UTC m=+147.504497075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.787026 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ch664" podStartSLOduration=5.787007257 podStartE2EDuration="5.787007257s" podCreationTimestamp="2025-12-16 06:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.759871023 +0000 UTC m=+147.021758652" watchObservedRunningTime="2025-12-16 06:53:28.787007257 +0000 UTC m=+147.048894886" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.787458 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pwj9t" podStartSLOduration=127.787452057 podStartE2EDuration="2m7.787452057s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.786888225 +0000 UTC m=+147.048775874" watchObservedRunningTime="2025-12-16 06:53:28.787452057 +0000 UTC m=+147.049339686" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.795801 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-s57cr" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.814074 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" podStartSLOduration=128.814057259 podStartE2EDuration="2m8.814057259s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.811463339 +0000 UTC m=+147.073350988" watchObservedRunningTime="2025-12-16 06:53:28.814057259 +0000 UTC m=+147.075944888" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.827794 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-s57cr" podStartSLOduration=127.827775585 podStartE2EDuration="2m7.827775585s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.826415033 +0000 UTC m=+147.088302662" watchObservedRunningTime="2025-12-16 06:53:28.827775585 +0000 UTC m=+147.089663214" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.843947 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hx7q5" podStartSLOduration=127.843925776 podStartE2EDuration="2m7.843925776s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.842468132 +0000 UTC m=+147.104355771" watchObservedRunningTime="2025-12-16 06:53:28.843925776 +0000 UTC m=+147.105813405" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.844530 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: 
\"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:28 crc kubenswrapper[4789]: E1216 06:53:28.846145 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.345605805 +0000 UTC m=+147.607493434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.890303 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-66bwg" podStartSLOduration=127.890277792 podStartE2EDuration="2m7.890277792s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.885704066 +0000 UTC m=+147.147591705" watchObservedRunningTime="2025-12-16 06:53:28.890277792 +0000 UTC m=+147.152165431" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.904526 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" podStartSLOduration=127.904507069 podStartE2EDuration="2m7.904507069s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:28.901626133 
+0000 UTC m=+147.163513752" watchObservedRunningTime="2025-12-16 06:53:28.904507069 +0000 UTC m=+147.166394698" Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.945566 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:28 crc kubenswrapper[4789]: E1216 06:53:28.945717 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.445691925 +0000 UTC m=+147.707579554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:28 crc kubenswrapper[4789]: I1216 06:53:28.945941 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:28 crc kubenswrapper[4789]: E1216 06:53:28.946260 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.446253039 +0000 UTC m=+147.708140668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.047058 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.047181 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.547148558 +0000 UTC m=+147.809036187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.047512 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.048016 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.548007517 +0000 UTC m=+147.809895146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.152321 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.153141 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.653117594 +0000 UTC m=+147.915005233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.253858 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.254328 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.754311401 +0000 UTC m=+148.016199030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.266561 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:29 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:29 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:29 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.266674 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.354507 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.356039 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:29.856009368 +0000 UTC m=+148.117897077 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.456945 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.457344 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:29.957332118 +0000 UTC m=+148.219219747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.562040 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.563295 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.063279343 +0000 UTC m=+148.325166972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: W1216 06:53:29.592100 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-025271f418ac235604e4d054d472585f45957c01613d0db633ff52964c763754 WatchSource:0}: Error finding container 025271f418ac235604e4d054d472585f45957c01613d0db633ff52964c763754: Status 404 returned error can't find the container with id 025271f418ac235604e4d054d472585f45957c01613d0db633ff52964c763754 Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.663526 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.663833 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.163818915 +0000 UTC m=+148.425706544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.664559 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" event={"ID":"658ac624-7f09-4d74-bd73-8b00a997847f","Type":"ContainerStarted","Data":"1c4e6bdb51dfda51f0de88dd7a5aa73dbeea9d53c6ca6946bbc9c1a58717079a"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.666090 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" event={"ID":"8f22acd3-56c4-42b8-badc-1239c0050781","Type":"ContainerStarted","Data":"3f2050534e03b22c438447f3ff3cbd171fe63dbc4620d256eff842f83375d62e"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.667529 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" event={"ID":"f651303c-cd90-4d8c-92c4-519a02627eb5","Type":"ContainerStarted","Data":"65f11831f822f920db4bd283014b962d1292dccc9a5c7cd56280dc57764ee4b9"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.669708 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-98ssf" event={"ID":"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1","Type":"ContainerStarted","Data":"3936828e03c9cbd055e963133cfb7750c67371b0435d568c89f2840fde87d495"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.670555 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" 
event={"ID":"ee5c72d0-500f-4d04-9a3c-76d815541c0a","Type":"ContainerStarted","Data":"7e7403226b1362f8546c9891c4d515e30c95747330a135bd05fc6968f6b63898"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.671553 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" event={"ID":"a8e72ec5-bc56-49fc-99e2-d8f246487fd4","Type":"ContainerStarted","Data":"61729c7a866ef92949c879f2785a5cbaa5f300853415537f7a036dd9360ac2f9"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.672266 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"025271f418ac235604e4d054d472585f45957c01613d0db633ff52964c763754"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.674982 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" event={"ID":"c27456e8-bb86-45c4-b482-dfc01d73f4b5","Type":"ContainerStarted","Data":"455f4173256b916d6c7e71fd00879c238e01562f92aad2ea8c40bc620a4b3cc7"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.682151 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" event={"ID":"cc6f123f-a6d4-4451-bdb5-82286e190c55","Type":"ContainerStarted","Data":"95aea1eb7ec70cd639bd31451bef05ee5cbb7aa8bb833d6b5fe87cf41f1f6293"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.684015 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cbqb2" event={"ID":"28e992ee-e81f-46d7-b422-27fa3023b7d8","Type":"ContainerStarted","Data":"1f1bec6865de399183647305ba1c580d76511b4f2703e7b3ecd7c92425ef06ff"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.685147 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" event={"ID":"41ef34b6-bb16-4602-a6b5-40597c0dc211","Type":"ContainerStarted","Data":"3988f265d633c612dc85971e0502019f5ceff213114c9042065bd2ec7b640e4e"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.705356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" event={"ID":"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2","Type":"ContainerStarted","Data":"950ac118029f910f4b89ab6267a0925c4ee158d45a929e3e7545f2283df074ef"} Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.707579 4789 patch_prober.go:28] interesting pod/downloads-7954f5f757-66bwg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.707639 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-66bwg" podUID="fa494689-eaaa-455c-ba63-2f6a295d5a27" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.714649 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.764676 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.764845 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.264825077 +0000 UTC m=+148.526712716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.765156 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.765493 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.265485273 +0000 UTC m=+148.527373002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.866831 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.867260 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.367241112 +0000 UTC m=+148.629128741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.869179 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.869598 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.369581716 +0000 UTC m=+148.631469345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:29 crc kubenswrapper[4789]: I1216 06:53:29.971632 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:29 crc kubenswrapper[4789]: E1216 06:53:29.972086 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.472065852 +0000 UTC m=+148.733953481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.073607 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.073958 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.573945234 +0000 UTC m=+148.835832863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.174607 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.174938 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.674893395 +0000 UTC m=+148.936781024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.175158 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.175477 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.675470339 +0000 UTC m=+148.937357968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.264535 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:30 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:30 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:30 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.264608 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.276129 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.276254 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:30.776237964 +0000 UTC m=+149.038125593 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.276336 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.276604 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.776597643 +0000 UTC m=+149.038485272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.377235 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.377620 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.877600785 +0000 UTC m=+149.139488414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.478857 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.479386 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:30.979359385 +0000 UTC m=+149.241247044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.580183 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.580630 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.080607002 +0000 UTC m=+149.342494631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.581249 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.581695 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.081679727 +0000 UTC m=+149.343567366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.682119 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.682532 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.182508785 +0000 UTC m=+149.444396444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.709416 4789 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-s4s66 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.709494 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" podUID="32383d71-3226-46ea-9d69-c3ab1096ec2c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.713871 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"acd5cce7ca1055f33a09b4012f9558d69770f587030a44baf986acc1e2834c12"} Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.715002 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"de18ed6ec98e1d684c6d009e86f07563aa89e5065bd0041d4c056ff0352fe0ca"} Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.783524 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.784101 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.28408481 +0000 UTC m=+149.545972519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.886025 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.886650 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.386633467 +0000 UTC m=+149.648521096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:30 crc kubenswrapper[4789]: I1216 06:53:30.987659 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:30 crc kubenswrapper[4789]: E1216 06:53:30.988161 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.488150662 +0000 UTC m=+149.750038291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.089062 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.089501 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.589482691 +0000 UTC m=+149.851370320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.191054 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.191535 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.691514557 +0000 UTC m=+149.953402216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.263982 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:31 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:31 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:31 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.264405 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.292119 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.292431 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:31.792400576 +0000 UTC m=+150.054288205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.292553 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.293353 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.793344028 +0000 UTC m=+150.055231657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.394526 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.394637 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.894610706 +0000 UTC m=+150.156498335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.395411 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.395766 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.895756013 +0000 UTC m=+150.157643642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.496862 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.497395 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:31.997380489 +0000 UTC m=+150.259268108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.598188 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.598702 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.098682148 +0000 UTC m=+150.360569867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.699218 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.699386 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.199356453 +0000 UTC m=+150.461244082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.699439 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.699893 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.199885154 +0000 UTC m=+150.461772783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.800363 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.800631 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.300564359 +0000 UTC m=+150.562451998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.800834 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.801423 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.301407469 +0000 UTC m=+150.563295108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.902621 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.902838 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.402798169 +0000 UTC m=+150.664685798 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:31 crc kubenswrapper[4789]: I1216 06:53:31.903125 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:31 crc kubenswrapper[4789]: E1216 06:53:31.903564 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.403543707 +0000 UTC m=+150.665431336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.004152 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.004407 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.504387045 +0000 UTC m=+150.766274684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.004620 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.005028 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.505009269 +0000 UTC m=+150.766896958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.105503 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.105784 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.605755795 +0000 UTC m=+150.867643424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.105995 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.106314 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.606302619 +0000 UTC m=+150.868190248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.210088 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.210358 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.71034554 +0000 UTC m=+150.972233169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.265124 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:32 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:32 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:32 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.265176 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.312866 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.313386 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:32.813372768 +0000 UTC m=+151.075260397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.413936 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.414300 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:32.914275709 +0000 UTC m=+151.176163338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.515013 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.515393 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.015371433 +0000 UTC m=+151.277259142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.616535 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.616735 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.116708542 +0000 UTC m=+151.378596171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.718015 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.718410 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.21839077 +0000 UTC m=+151.480278479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.726605 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" event={"ID":"2cbecadb-0f2a-443e-b065-edc627985d96","Type":"ContainerStarted","Data":"8761244a5e8a9f5eddfc512b06f82d2e14b47656c096730a7fa70012c80fe510"} Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.728157 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" event={"ID":"c4f62aa0-33ed-424a-a0b4-92a3ce4ed97e","Type":"ContainerStarted","Data":"c595d7ecfdc89f3047af2720e0c7da06134b9fe2202a69634fb6e65aca23b103"} Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.729720 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" event={"ID":"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f","Type":"ContainerStarted","Data":"5aefc695ecd5e83cc8d3dbec3d1c1f80da838062ac4e17fc160fbaa35665f9ab"} Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.731181 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" event={"ID":"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae","Type":"ContainerStarted","Data":"e15c7fb78413d38bf34c4b9094753ca8e5376db0c9db202ba5860b764980ca37"} Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.819264 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.819472 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.319445484 +0000 UTC m=+151.581333113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.819820 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.820142 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.320134729 +0000 UTC m=+151.582022358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.920590 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.920776 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.420750212 +0000 UTC m=+151.682637841 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:32 crc kubenswrapper[4789]: I1216 06:53:32.921062 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:32 crc kubenswrapper[4789]: E1216 06:53:32.921375 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.421364517 +0000 UTC m=+151.683252146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.022795 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.023069 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.523034474 +0000 UTC m=+151.784922113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.023166 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.023687 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.523665859 +0000 UTC m=+151.785553688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.124456 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.124750 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.624736532 +0000 UTC m=+151.886624161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.225840 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.226176 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.726160744 +0000 UTC m=+151.988048373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.271543 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:33 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:33 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:33 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.271610 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.326714 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.326949 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:33.82690582 +0000 UTC m=+152.088793449 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.327191 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.327527 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.827514684 +0000 UTC m=+152.089402313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.418666 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.419336 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.421433 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.421445 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.427658 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.427740 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:33.927724727 +0000 UTC m=+152.189612356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.427881 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.428223 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:33.928209229 +0000 UTC m=+152.190096858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.468673 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.528494 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.528728 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.528764 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.528900 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.028882374 +0000 UTC m=+152.290770003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.629697 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.629958 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.630023 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.630118 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.630552 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.13053533 +0000 UTC m=+152.392422959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.654036 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.731207 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 
06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.731452 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.231415289 +0000 UTC m=+152.493302918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.731506 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.731796 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.731988 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.231950182 +0000 UTC m=+152.493837811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.735746 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" event={"ID":"51cf8011-671e-401b-ba7f-b062fa607e7f","Type":"ContainerStarted","Data":"16d7f0c26c9db02361ea4b095d1b6bb35c8b883ae1886cef5611a0d732658970"} Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.736868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" event={"ID":"4e6e8c5a-b937-4072-902d-28e056de16d2","Type":"ContainerStarted","Data":"c8e96bffcf0ba6f28aabbf35de857a380e1bf23c8d8f14b3997302cf29b1bf22"} Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.832791 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.833159 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.333140928 +0000 UTC m=+152.595028557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:33 crc kubenswrapper[4789]: I1216 06:53:33.933977 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:33 crc kubenswrapper[4789]: E1216 06:53:33.934347 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.434330614 +0000 UTC m=+152.696218243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.035454 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.035837 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.535822718 +0000 UTC m=+152.797710347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.043325 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.136804 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.137299 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.63728714 +0000 UTC m=+152.899174769 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.239987 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.240193 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.740161856 +0000 UTC m=+153.002049485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.240790 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.241220 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.741209879 +0000 UTC m=+153.003097508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.270356 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:34 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:34 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:34 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.270443 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.342518 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.343352 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:34.843330067 +0000 UTC m=+153.105217696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.443971 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.444411 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:34.9443911 +0000 UTC m=+153.206278809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.544847 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.545131 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.045115446 +0000 UTC m=+153.307003075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.646777 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.647166 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.147150942 +0000 UTC m=+153.409038571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.741007 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4a756c0f-e2a4-46d1-837a-a6f9ed694f73","Type":"ContainerStarted","Data":"2000f459ad553be4ce83501d21b530eddc8cd9847c97a4f76ac80906a2e6f272"} Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.742090 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" event={"ID":"41ef34b6-bb16-4602-a6b5-40597c0dc211","Type":"ContainerStarted","Data":"c86d6536ac3258396440babc85952bf311143ca7aa6a5fb9e75f943739ef93e7"} Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.747711 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.747877 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.247859827 +0000 UTC m=+153.509747456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.747987 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.748298 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.248289728 +0000 UTC m=+153.510177357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.849496 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.849809 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.349787631 +0000 UTC m=+153.611675280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:34 crc kubenswrapper[4789]: I1216 06:53:34.950832 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:34 crc kubenswrapper[4789]: E1216 06:53:34.951320 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.451295084 +0000 UTC m=+153.713182743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.052244 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.052434 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.552396389 +0000 UTC m=+153.814284058 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.052474 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.052776 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.552763078 +0000 UTC m=+153.814650707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.153158 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.153324 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.653294299 +0000 UTC m=+153.915181928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.153419 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.153737 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.653722778 +0000 UTC m=+153.915610407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.254646 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.255139 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.755117989 +0000 UTC m=+154.017005628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.263606 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:35 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:35 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:35 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.263661 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.356569 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.356852 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:35.856839468 +0000 UTC m=+154.118727097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.457729 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.457981 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.957902821 +0000 UTC m=+154.219790490 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.458303 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.458693 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:35.95867645 +0000 UTC m=+154.220564109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.560061 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.560309 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.060275475 +0000 UTC m=+154.322163104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.560468 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.560889 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.060879059 +0000 UTC m=+154.322766698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.661906 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.662089 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.162055415 +0000 UTC m=+154.423943044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.662282 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.662650 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.162636749 +0000 UTC m=+154.424524468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.763399 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.763563 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.263532958 +0000 UTC m=+154.525420607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.763620 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.763888 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.263877266 +0000 UTC m=+154.525764895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.864694 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.864873 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.364844527 +0000 UTC m=+154.626732166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.865007 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.865365 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.365353379 +0000 UTC m=+154.627241028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.908307 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.934553 4789 patch_prober.go:28] interesting pod/downloads-7954f5f757-66bwg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.934598 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-66bwg" podUID="fa494689-eaaa-455c-ba63-2f6a295d5a27" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.934634 4789 patch_prober.go:28] interesting pod/downloads-7954f5f757-66bwg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.934689 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-66bwg" podUID="fa494689-eaaa-455c-ba63-2f6a295d5a27" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": 
dial tcp 10.217.0.15:8080: connect: connection refused" Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.966047 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.966174 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.466149556 +0000 UTC m=+154.728037185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:35 crc kubenswrapper[4789]: I1216 06:53:35.966332 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:35 crc kubenswrapper[4789]: E1216 06:53:35.967499 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:36.467484777 +0000 UTC m=+154.729372456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.068079 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.068457 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.568441628 +0000 UTC m=+154.830329257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.169803 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.170205 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.670188437 +0000 UTC m=+154.932076066 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.260758 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pwj9t" Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.262120 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:36 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:36 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:36 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.262175 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.270492 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.270621 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.770604836 +0000 UTC m=+155.032492465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.270768 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.271059 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.771049836 +0000 UTC m=+155.032937465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.371389 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.371608 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.871583567 +0000 UTC m=+155.133471196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.371685 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.371990 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.871981956 +0000 UTC m=+155.133869585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.472704 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.472874 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.972842735 +0000 UTC m=+155.234730364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.473058 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.473344 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:36.973337327 +0000 UTC m=+155.235224956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.574147 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.574363 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.074313478 +0000 UTC m=+155.336201107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.574646 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.575001 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.074993233 +0000 UTC m=+155.336880852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.675504 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.675814 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.175793631 +0000 UTC m=+155.437681280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.777163 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.777488 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.277476409 +0000 UTC m=+155.539364038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.878094 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.883553 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.383517837 +0000 UTC m=+155.645405466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:36 crc kubenswrapper[4789]: I1216 06:53:36.981063 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:36 crc kubenswrapper[4789]: E1216 06:53:36.981438 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.481422257 +0000 UTC m=+155.743309886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.082536 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.082682 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.582654435 +0000 UTC m=+155.844542064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.082723 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.083054 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.583043274 +0000 UTC m=+155.844930903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.184206 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.184333 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.684304071 +0000 UTC m=+155.946191690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.184797 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.185131 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.68512011 +0000 UTC m=+155.947007739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.265124 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:37 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:37 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:37 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.265174 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.285856 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.286165 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:37.786151563 +0000 UTC m=+156.048039192 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.387139 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.387522 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.887507413 +0000 UTC m=+156.149395042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.487740 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.488127 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:37.988109496 +0000 UTC m=+156.249997115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.589801 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.590123 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.090107571 +0000 UTC m=+156.351995200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.691469 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.691741 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.191714137 +0000 UTC m=+156.453601766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.692076 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.692403 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.192388742 +0000 UTC m=+156.454276371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.762157 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7270f997297540c297c6ed9a528031a66391506ce521bd513aa115d32bab7c36"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.764422 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" event={"ID":"120a166b-9fed-4921-940d-6c43c0a145c0","Type":"ContainerStarted","Data":"fb030ed4f9b674c57cc8553f9a9305c56e2ce76afc30353cc0f9f582dad0faff"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.766487 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" event={"ID":"8b2c6c23-962c-4829-bd8a-088c7c63dfa4","Type":"ContainerStarted","Data":"e77c04cf3d1b1a6b489bf343d3943139cb97245f33abf36fb0b9b037b2adf6fe"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.768024 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" event={"ID":"8f22acd3-56c4-42b8-badc-1239c0050781","Type":"ContainerStarted","Data":"186c03f71b1cd42564adaca1da82ce0741011ba09f997aec46b49e1d9b66b895"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.770138 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" event={"ID":"ee5c72d0-500f-4d04-9a3c-76d815541c0a","Type":"ContainerStarted","Data":"d2eb845b140104e2e9bdc331725f295efb1d57cd9af22446aad5e4c189a4b66f"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.771470 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" event={"ID":"a8e72ec5-bc56-49fc-99e2-d8f246487fd4","Type":"ContainerStarted","Data":"c3f7dd5d4042e6d8f30151ee412930eea703d886420c9ab80824c4d1f13bb6b6"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.772808 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" event={"ID":"ed831a75-86f6-4f98-a6f2-63e45e6f051b","Type":"ContainerStarted","Data":"67e06aaadf7b439aecb88926e5359843c6a9b1c079076570fea6d9bfd47303c2"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.774783 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" event={"ID":"90244cab-89b7-4109-b673-a7cd881ae0a4","Type":"ContainerStarted","Data":"e420fd851714938c45a36067b65695ffd7b9dc52b286281cd0721c12217152a7"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.775970 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" event={"ID":"cc6f123f-a6d4-4451-bdb5-82286e190c55","Type":"ContainerStarted","Data":"943ae486d4d87d2d3ee0fead9cb3bdbd5edd22826b1d5756119f036b0cd2f2c4"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.777192 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" event={"ID":"658ac624-7f09-4d74-bd73-8b00a997847f","Type":"ContainerStarted","Data":"11160bc7dc6f7db0876e57fa0a42fe63586c49022c681b33a53ebe4f7e80fe10"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 
06:53:37.778736 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" event={"ID":"d5615d08-80d1-4209-9cc4-c9b27e1ac024","Type":"ContainerStarted","Data":"6e0a41faf3a9aa64f4ce3cd7317958f465922cdb9233e39ea5c88a80f5045bc8"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.780294 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" event={"ID":"03d51736-0f2b-4c40-b6f1-ee44fa4312f9","Type":"ContainerStarted","Data":"44c5f1b7d671e9f5747dc3175b4a2f29c9eabc568087d64d8da5d40c5895913b"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.782004 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" event={"ID":"a4eda00f-7a9e-4c1f-98e4-bd844d9cef9f","Type":"ContainerStarted","Data":"3eae305da337c7a43383254b7b27001266599a7671f5118e4928be852339cb10"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.783551 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" event={"ID":"c27456e8-bb86-45c4-b482-dfc01d73f4b5","Type":"ContainerStarted","Data":"a9b8a10ecd1cb294c1f41007429e89be734d98367819d91fd093697a53ca14eb"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.785183 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" event={"ID":"6e51b8b3-6bc6-4462-ae45-eb782f3c27f2","Type":"ContainerStarted","Data":"93e59b093b7f8cb7db2e16b7dc1546de74b8639ca34996667ef508f8713c9603"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.786806 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cbqb2" event={"ID":"28e992ee-e81f-46d7-b422-27fa3023b7d8","Type":"ContainerStarted","Data":"a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501"} Dec 16 
06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.788549 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" event={"ID":"84194086-c0d5-40d8-930d-a83c50b7dd3f","Type":"ContainerStarted","Data":"52fa7366cdffc70c8b44ab75a93a89777dabd36269cddc9f034993e51b8c1682"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.789861 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" event={"ID":"f651303c-cd90-4d8c-92c4-519a02627eb5","Type":"ContainerStarted","Data":"c724b5b664903869a463c432849aa5aae3f1f8c58bf77050d24625665b802529"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.791132 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6673d1074f9efecae01d7d78abdaf854591df03f1edf27fed6f8283cdd131c76"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.792739 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.792876 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.292846722 +0000 UTC m=+156.554734351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.793176 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.793263 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" event={"ID":"c06d6dec-3c45-42f3-bd57-dece3f5dafe6","Type":"ContainerStarted","Data":"56ae5ee98b507625c96eec58dd77e36a5cab61519ececa35b43dfbd19292545b"} Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.793571 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.293555348 +0000 UTC m=+156.555442977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.794887 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" event={"ID":"9b88cb44-5b2f-4838-bddd-7b3b17ebb629","Type":"ContainerStarted","Data":"b53e7cf2912689a182b079b27cd280b44f892ec513c0ea841833f89261732a92"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.797737 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" event={"ID":"f1038662-bd55-4e28-bd30-53e66f03ff85","Type":"ContainerStarted","Data":"e26e91f3b9242b6866018b1b324c006b8d09f718febceaf92a2ae909bc9c4ff3"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.799810 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7fvzt" event={"ID":"cce118b7-47b5-499b-9cc6-e5e24ba1c317","Type":"ContainerStarted","Data":"16c028089006bf8d1cd22170fdd4e0248f4f1cc27c153f6e41733dc03bf2f11d"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.801786 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-98ssf" event={"ID":"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1","Type":"ContainerStarted","Data":"e2d494e0f08dc469939d9b53b08ff566e6854e68dc07c005d62471b25d957481"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.803466 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"81470cc6e49503b94272bce5de88cd0cca9e068a87f1337ee5896f097787517b"} Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.819278 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2thgc" podStartSLOduration=136.81925908 podStartE2EDuration="2m16.81925908s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:37.81882234 +0000 UTC m=+156.080709969" watchObservedRunningTime="2025-12-16 06:53:37.81925908 +0000 UTC m=+156.081146709" Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.847514 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lvrjn" podStartSLOduration=136.847485018 podStartE2EDuration="2m16.847485018s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:37.843385024 +0000 UTC m=+156.105272663" watchObservedRunningTime="2025-12-16 06:53:37.847485018 +0000 UTC m=+156.109372647" Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.861277 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" podStartSLOduration=137.861257285 podStartE2EDuration="2m17.861257285s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:37.859945015 +0000 UTC m=+156.121832654" watchObservedRunningTime="2025-12-16 06:53:37.861257285 +0000 UTC m=+156.123144924" Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 
06:53:37.896657 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.898062 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.398045711 +0000 UTC m=+156.659933340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:37 crc kubenswrapper[4789]: I1216 06:53:37.998485 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:37 crc kubenswrapper[4789]: E1216 06:53:37.999590 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:38.499572825 +0000 UTC m=+156.761460454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.100319 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.100608 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.600530916 +0000 UTC m=+156.862418535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.100969 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.101571 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.601563989 +0000 UTC m=+156.863451618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.202450 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.202645 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.702618033 +0000 UTC m=+156.964505662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.202752 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.203100 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.703088143 +0000 UTC m=+156.964975772 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.265553 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:38 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:38 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:38 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.265613 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.304297 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.304469 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:38.804446283 +0000 UTC m=+157.066333912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.304561 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.304857 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.804849733 +0000 UTC m=+157.066737362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.405907 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.406045 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.906020869 +0000 UTC m=+157.167908498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.406157 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.406462 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:38.906454928 +0000 UTC m=+157.168342547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.507452 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.507659 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.007632714 +0000 UTC m=+157.269520343 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.507832 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.508179 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.008166077 +0000 UTC m=+157.270053706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.609486 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.609704 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.10967636 +0000 UTC m=+157.371563989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.609936 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.610244 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.110235704 +0000 UTC m=+157.372123323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.711291 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.711510 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.211480681 +0000 UTC m=+157.473368310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.811532 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" event={"ID":"c27456e8-bb86-45c4-b482-dfc01d73f4b5","Type":"ContainerStarted","Data":"944821ba6688a02a04939ef7d0723752cd14bdb7c3d4e64af43d9ec077ef75ce"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.812435 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.812739 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.312723788 +0000 UTC m=+157.574611417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.813547 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" event={"ID":"8f22acd3-56c4-42b8-badc-1239c0050781","Type":"ContainerStarted","Data":"da9f31d59661e267e7b0825966d513155ade5102a28b03392c5479e78ae4caa9"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.818337 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" event={"ID":"41ef34b6-bb16-4602-a6b5-40597c0dc211","Type":"ContainerStarted","Data":"6737c37f0fb94cb2878c0f9ce2e80304464a3ca55846b791bbaf1d180f74bb77"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.820089 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" event={"ID":"d01cea9e-7204-4eb9-8cf0-d6d3b82eea5f","Type":"ContainerStarted","Data":"257003218ed6c58efed805c8d9de0fc6287e667c50dca0b3a253d4d21a6ac851"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.822260 4789 generic.go:334] "Generic (PLEG): container finished" podID="4e6e8c5a-b937-4072-902d-28e056de16d2" containerID="c8e96bffcf0ba6f28aabbf35de857a380e1bf23c8d8f14b3997302cf29b1bf22" exitCode=0 Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.822299 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" 
event={"ID":"4e6e8c5a-b937-4072-902d-28e056de16d2","Type":"ContainerDied","Data":"c8e96bffcf0ba6f28aabbf35de857a380e1bf23c8d8f14b3997302cf29b1bf22"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.824189 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-98ssf" event={"ID":"dbadb44f-ac2d-4056-97c6-1af7fb39e4f1","Type":"ContainerStarted","Data":"48a9b4ff655eb7b396c11bcdea01dbcfd4a6491f3c6d521683b3e5ce0afb0c97"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.824309 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-98ssf" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.828233 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" event={"ID":"9b88cb44-5b2f-4838-bddd-7b3b17ebb629","Type":"ContainerStarted","Data":"72fe4b79732d0b35bc045d3ea830b38c9e2916a10791117b46ac4dbdb6b6f7d9"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.832284 4789 generic.go:334] "Generic (PLEG): container finished" podID="cc6f123f-a6d4-4451-bdb5-82286e190c55" containerID="943ae486d4d87d2d3ee0fead9cb3bdbd5edd22826b1d5756119f036b0cd2f2c4" exitCode=0 Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.832781 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" event={"ID":"cc6f123f-a6d4-4451-bdb5-82286e190c55","Type":"ContainerDied","Data":"943ae486d4d87d2d3ee0fead9cb3bdbd5edd22826b1d5756119f036b0cd2f2c4"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.836424 4789 generic.go:334] "Generic (PLEG): container finished" podID="4a756c0f-e2a4-46d1-837a-a6f9ed694f73" containerID="25ec2d8671461a0a730a26e644598fa1f0d8fab490748a4ad7184229b44f8367" exitCode=0 Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.837066 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"4a756c0f-e2a4-46d1-837a-a6f9ed694f73","Type":"ContainerDied","Data":"25ec2d8671461a0a730a26e644598fa1f0d8fab490748a4ad7184229b44f8367"} Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.837100 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.838666 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" podStartSLOduration=137.838652595 podStartE2EDuration="2m17.838652595s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:37.899514234 +0000 UTC m=+156.161401863" watchObservedRunningTime="2025-12-16 06:53:38.838652595 +0000 UTC m=+157.100540224" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.854075 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.854132 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.854150 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.854176 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.854193 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:38 crc 
kubenswrapper[4789]: I1216 06:53:38.854207 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.854623 4789 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7zfdj container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.854669 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" podUID="6e51b8b3-6bc6-4462-ae45-eb782f3c27f2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.855038 4789 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cmw44 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.855080 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.855571 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gkn6m" podStartSLOduration=137.855557214 podStartE2EDuration="2m17.855557214s" 
podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:38.837538189 +0000 UTC m=+157.099425818" watchObservedRunningTime="2025-12-16 06:53:38.855557214 +0000 UTC m=+157.117444853" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.870471 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.913420 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-swk2d" podStartSLOduration=137.913387073 podStartE2EDuration="2m17.913387073s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:38.890118518 +0000 UTC m=+157.152006147" watchObservedRunningTime="2025-12-16 06:53:38.913387073 +0000 UTC m=+157.175274702" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.914893 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:38 crc kubenswrapper[4789]: E1216 06:53:38.944870 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.418301686 +0000 UTC m=+157.680189335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.946756 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" podStartSLOduration=137.9467374 podStartE2EDuration="2m17.9467374s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:38.922167705 +0000 UTC m=+157.184055334" watchObservedRunningTime="2025-12-16 06:53:38.9467374 +0000 UTC m=+157.208625029" Dec 16 06:53:38 crc kubenswrapper[4789]: I1216 06:53:38.981990 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" podStartSLOduration=138.98196449900001 podStartE2EDuration="2m18.981964499s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:38.967992778 +0000 UTC m=+157.229880407" watchObservedRunningTime="2025-12-16 06:53:38.981964499 +0000 UTC m=+157.243852138" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.059388 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: 
\"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.060023 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.560009974 +0000 UTC m=+157.821897603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.117434 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mt5pk" podStartSLOduration=138.117420044 podStartE2EDuration="2m18.117420044s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.115974281 +0000 UTC m=+157.377861910" watchObservedRunningTime="2025-12-16 06:53:39.117420044 +0000 UTC m=+157.379307673" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.118483 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jvs9x" podStartSLOduration=139.118478768 podStartE2EDuration="2m19.118478768s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-16 06:53:39.088572 +0000 UTC m=+157.350459629" watchObservedRunningTime="2025-12-16 06:53:39.118478768 +0000 UTC m=+157.380366397" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.161312 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mw7cs" podStartSLOduration=138.161285502 podStartE2EDuration="2m18.161285502s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.160244629 +0000 UTC m=+157.422132258" watchObservedRunningTime="2025-12-16 06:53:39.161285502 +0000 UTC m=+157.423173131" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.161837 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.162159 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.662143372 +0000 UTC m=+157.924031001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.226251 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-98ssf" podStartSLOduration=16.226230496 podStartE2EDuration="16.226230496s" podCreationTimestamp="2025-12-16 06:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.22556301 +0000 UTC m=+157.487450639" watchObservedRunningTime="2025-12-16 06:53:39.226230496 +0000 UTC m=+157.488118135" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.265316 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.265716 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.765699383 +0000 UTC m=+158.027587022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.266367 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:39 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:39 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:39 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.266431 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.275933 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fnph9" podStartSLOduration=138.275889507 podStartE2EDuration="2m18.275889507s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.274087656 +0000 UTC m=+157.535975285" watchObservedRunningTime="2025-12-16 06:53:39.275889507 +0000 UTC m=+157.537777146" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.330568 4789 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-thjv9" podStartSLOduration=138.330551204 podStartE2EDuration="2m18.330551204s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.329230333 +0000 UTC m=+157.591117962" watchObservedRunningTime="2025-12-16 06:53:39.330551204 +0000 UTC m=+157.592438833" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.331312 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-cbqb2" podStartSLOduration=138.331306341 podStartE2EDuration="2m18.331306341s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.30689998 +0000 UTC m=+157.568787609" watchObservedRunningTime="2025-12-16 06:53:39.331306341 +0000 UTC m=+157.593193970" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.365831 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.366184 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.866153903 +0000 UTC m=+158.128041542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.374438 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" podStartSLOduration=138.374417703 podStartE2EDuration="2m18.374417703s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.371892334 +0000 UTC m=+157.633779963" watchObservedRunningTime="2025-12-16 06:53:39.374417703 +0000 UTC m=+157.636305342" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.375015 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9" podStartSLOduration=138.375008076 podStartE2EDuration="2m18.375008076s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.354516824 +0000 UTC m=+157.616404473" watchObservedRunningTime="2025-12-16 06:53:39.375008076 +0000 UTC m=+157.636895705" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.416513 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rnfxb" podStartSLOduration=138.41649177 podStartE2EDuration="2m18.41649177s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.414053904 +0000 UTC m=+157.675941563" watchObservedRunningTime="2025-12-16 06:53:39.41649177 +0000 UTC m=+157.678379399" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.467038 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.467364 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:39.967354159 +0000 UTC m=+158.229241788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.532032 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.532771 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.554135 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hspwh" podStartSLOduration=138.554115344 podStartE2EDuration="2m18.554115344s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.553683714 +0000 UTC m=+157.815571353" watchObservedRunningTime="2025-12-16 06:53:39.554115344 +0000 UTC m=+157.816002973" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.556752 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8vm2v" podStartSLOduration=138.556737424 podStartE2EDuration="2m18.556737424s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.506160671 +0000 UTC m=+157.768048310" watchObservedRunningTime="2025-12-16 06:53:39.556737424 
+0000 UTC m=+157.818625063" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.562942 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.569411 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.569627 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.06960212 +0000 UTC m=+158.331489769 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.569840 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.570210 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.070198423 +0000 UTC m=+158.332086052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.628820 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7ntql" podStartSLOduration=138.62879715 podStartE2EDuration="2m18.62879715s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.58874499 +0000 UTC m=+157.850632619" watchObservedRunningTime="2025-12-16 06:53:39.62879715 +0000 UTC m=+157.890684779" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.630222 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9mh47" podStartSLOduration=138.630213833 podStartE2EDuration="2m18.630213833s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.630196442 +0000 UTC m=+157.892084071" watchObservedRunningTime="2025-12-16 06:53:39.630213833 +0000 UTC m=+157.892101462" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.672533 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.672994 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.172972286 +0000 UTC m=+158.434859915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.710841 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7fvzt" podStartSLOduration=16.710823396 podStartE2EDuration="16.710823396s" podCreationTimestamp="2025-12-16 06:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.670742325 +0000 UTC m=+157.932629974" watchObservedRunningTime="2025-12-16 06:53:39.710823396 +0000 UTC m=+157.972711025" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.717296 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-grmxx" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.774496 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.774937 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.27490545 +0000 UTC m=+158.536793079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.788684 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" podStartSLOduration=139.788663455 podStartE2EDuration="2m19.788663455s" podCreationTimestamp="2025-12-16 06:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.784644624 +0000 UTC m=+158.046532253" watchObservedRunningTime="2025-12-16 06:53:39.788663455 +0000 UTC m=+158.050551094" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.806489 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g8md7" podStartSLOduration=140.806467565 podStartE2EDuration="2m20.806467565s" podCreationTimestamp="2025-12-16 06:51:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.806104207 +0000 UTC m=+158.067991836" watchObservedRunningTime="2025-12-16 06:53:39.806467565 +0000 UTC m=+158.068355204" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.845720 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" podStartSLOduration=138.845705057 podStartE2EDuration="2m18.845705057s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.844711864 +0000 UTC m=+158.106599493" watchObservedRunningTime="2025-12-16 06:53:39.845705057 +0000 UTC m=+158.107592686" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.859126 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" event={"ID":"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae","Type":"ContainerStarted","Data":"638e3a7feb2bab535d9e22c695c844c41fc09046842ef8105661d002a71c1bad"} Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.861598 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" event={"ID":"cc6f123f-a6d4-4451-bdb5-82286e190c55","Type":"ContainerStarted","Data":"2c931ab70bab1a661e9e721d5fbd2cf744760f112aed043e16248d46e174e30a"} Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.865061 4789 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cmw44 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.865119 4789 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.875146 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-slw4b" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.886021 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.886525 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.386509225 +0000 UTC m=+158.648396854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.976234 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7zfdj" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.989651 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:39 crc kubenswrapper[4789]: I1216 06:53:39.990170 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" podStartSLOduration=138.990153518 podStartE2EDuration="2m18.990153518s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:39.949499373 +0000 UTC m=+158.211387012" watchObservedRunningTime="2025-12-16 06:53:39.990153518 +0000 UTC m=+158.252041147" Dec 16 06:53:39 crc kubenswrapper[4789]: E1216 06:53:39.993135 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 06:53:40.493121736 +0000 UTC m=+158.755009365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.021581 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-djgqn" podStartSLOduration=139.021565241 podStartE2EDuration="2m19.021565241s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:40.021208413 +0000 UTC m=+158.283096042" watchObservedRunningTime="2025-12-16 06:53:40.021565241 +0000 UTC m=+158.283452870" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.094736 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.095154 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.595135181 +0000 UTC m=+158.857022810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.140075 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5r7k7"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.140884 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5r7k7"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.140971 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.143999 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.165994 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.198592 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bblh\" (UniqueName: \"kubernetes.io/projected/9af33cc0-7e86-482a-b3a1-89df07600676-kube-api-access-5bblh\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.198650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-utilities\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.198682 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-catalog-content\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.198728 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.198986 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.698974519 +0000 UTC m=+158.960862148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.253244 4789 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.287642 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:40 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:40 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:40 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.287931 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.301656 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.302504 4789 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.802479358 +0000 UTC m=+159.064366987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.302629 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-utilities\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.301928 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-utilities\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.302691 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-catalog-content\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.302743 
4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.302811 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bblh\" (UniqueName: \"kubernetes.io/projected/9af33cc0-7e86-482a-b3a1-89df07600676-kube-api-access-5bblh\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.303302 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-catalog-content\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.303519 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.803508452 +0000 UTC m=+159.065396081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.315965 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-btjcd"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.321981 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.322549 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.328165 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.328231 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.331336 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btjcd"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.331463 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.334177 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.345715 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bblh\" (UniqueName: \"kubernetes.io/projected/9af33cc0-7e86-482a-b3a1-89df07600676-kube-api-access-5bblh\") pod \"certified-operators-5r7k7\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.404792 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.405099 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.405175 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8gcv\" (UniqueName: \"kubernetes.io/projected/8a620056-2e2e-46ae-9a32-c8aea4b297c4-kube-api-access-m8gcv\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.405212 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-catalog-content\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.405242 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-utilities\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.405281 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.405400 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:40.905379154 +0000 UTC m=+159.167266783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.416783 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.473524 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.489482 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.507083 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qphpp"] Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.516370 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6e8c5a-b937-4072-902d-28e056de16d2" containerName="collect-profiles" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.516398 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6e8c5a-b937-4072-902d-28e056de16d2" containerName="collect-profiles" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.516495 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6e8c5a-b937-4072-902d-28e056de16d2" containerName="collect-profiles" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.517253 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.508217 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 06:53:41.008163957 +0000 UTC m=+159.270051586 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-p5786" (UID: "be028739-1351-4883-95ec-35fb89831c72") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.507890 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.518082 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.518186 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.518299 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8gcv\" (UniqueName: \"kubernetes.io/projected/8a620056-2e2e-46ae-9a32-c8aea4b297c4-kube-api-access-m8gcv\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.518351 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-catalog-content\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.518402 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-utilities\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.518955 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-utilities\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.519005 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.519953 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-catalog-content\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.521366 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qphpp"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.549582 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.561011 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8gcv\" (UniqueName: \"kubernetes.io/projected/8a620056-2e2e-46ae-9a32-c8aea4b297c4-kube-api-access-m8gcv\") pod \"community-operators-btjcd\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.577890 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.618795 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6e8c5a-b937-4072-902d-28e056de16d2-secret-volume\") pod \"4e6e8c5a-b937-4072-902d-28e056de16d2\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.619142 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp5sv\" (UniqueName: \"kubernetes.io/projected/4e6e8c5a-b937-4072-902d-28e056de16d2-kube-api-access-sp5sv\") pod \"4e6e8c5a-b937-4072-902d-28e056de16d2\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.619215 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6e8c5a-b937-4072-902d-28e056de16d2-config-volume\") pod \"4e6e8c5a-b937-4072-902d-28e056de16d2\" (UID: \"4e6e8c5a-b937-4072-902d-28e056de16d2\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.619324 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.619462 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-utilities\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: 
I1216 06:53:40.619543 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkl4\" (UniqueName: \"kubernetes.io/projected/ffdc85ec-5987-47f0-af71-9896f60cb294-kube-api-access-wdkl4\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.619573 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-catalog-content\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.619670 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 06:53:41.119655631 +0000 UTC m=+159.381543260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.620179 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e6e8c5a-b937-4072-902d-28e056de16d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "4e6e8c5a-b937-4072-902d-28e056de16d2" (UID: "4e6e8c5a-b937-4072-902d-28e056de16d2"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.623112 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e6e8c5a-b937-4072-902d-28e056de16d2-kube-api-access-sp5sv" (OuterVolumeSpecName: "kube-api-access-sp5sv") pod "4e6e8c5a-b937-4072-902d-28e056de16d2" (UID: "4e6e8c5a-b937-4072-902d-28e056de16d2"). InnerVolumeSpecName "kube-api-access-sp5sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.627336 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e6e8c5a-b937-4072-902d-28e056de16d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4e6e8c5a-b937-4072-902d-28e056de16d2" (UID: "4e6e8c5a-b937-4072-902d-28e056de16d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.651145 4789 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-16T06:53:40.253269177Z","Handler":null,"Name":""} Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.671407 4789 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.671432 4789 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.686789 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.694120 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dmjv4"] Dec 16 06:53:40 crc kubenswrapper[4789]: E1216 06:53:40.694361 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a756c0f-e2a4-46d1-837a-a6f9ed694f73" containerName="pruner" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.694374 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a756c0f-e2a4-46d1-837a-a6f9ed694f73" containerName="pruner" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.694470 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a756c0f-e2a4-46d1-837a-a6f9ed694f73" containerName="pruner" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.697089 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.710609 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.714045 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dmjv4"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.720992 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kubelet-dir\") pod \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.721276 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kube-api-access\") pod \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\" (UID: \"4a756c0f-e2a4-46d1-837a-a6f9ed694f73\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722017 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4a756c0f-e2a4-46d1-837a-a6f9ed694f73" (UID: "4a756c0f-e2a4-46d1-837a-a6f9ed694f73"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722125 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-utilities\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722200 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722217 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkl4\" (UniqueName: \"kubernetes.io/projected/ffdc85ec-5987-47f0-af71-9896f60cb294-kube-api-access-wdkl4\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722240 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-catalog-content\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722282 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e6e8c5a-b937-4072-902d-28e056de16d2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 06:53:40 crc 
kubenswrapper[4789]: I1216 06:53:40.722292 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp5sv\" (UniqueName: \"kubernetes.io/projected/4e6e8c5a-b937-4072-902d-28e056de16d2-kube-api-access-sp5sv\") on node \"crc\" DevicePath \"\"" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722302 4789 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722311 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e6e8c5a-b937-4072-902d-28e056de16d2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.722690 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-catalog-content\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.724114 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-utilities\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.726577 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.726606 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.727519 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4a756c0f-e2a4-46d1-837a-a6f9ed694f73" (UID: "4a756c0f-e2a4-46d1-837a-a6f9ed694f73"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.747103 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkl4\" (UniqueName: \"kubernetes.io/projected/ffdc85ec-5987-47f0-af71-9896f60cb294-kube-api-access-wdkl4\") pod \"certified-operators-qphpp\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.758845 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-p5786\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.823470 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.823761 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-utilities\") pod \"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.823852 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-catalog-content\") pod 
\"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.823876 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kklnz\" (UniqueName: \"kubernetes.io/projected/c18aa314-a66a-4cb6-95ec-d605e999b29f-kube-api-access-kklnz\") pod \"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.824057 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4a756c0f-e2a4-46d1-837a-a6f9ed694f73-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.825327 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5r7k7"] Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.849057 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.864066 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.889368 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.890004 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7k7" event={"ID":"9af33cc0-7e86-482a-b3a1-89df07600676","Type":"ContainerStarted","Data":"d35afea1d0dfa60d63d3f88635c6bec1d5234cc2970104a6a72e1d9275a8e752"} Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.902403 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4a756c0f-e2a4-46d1-837a-a6f9ed694f73","Type":"ContainerDied","Data":"2000f459ad553be4ce83501d21b530eddc8cd9847c97a4f76ac80906a2e6f272"} Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.902464 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2000f459ad553be4ce83501d21b530eddc8cd9847c97a4f76ac80906a2e6f272" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.902540 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.919123 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.919450 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" event={"ID":"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae","Type":"ContainerStarted","Data":"4fa3e0a0fc8b46512d28693fb0c9c908022632282f55847407e399ff33e610f4"} Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.919471 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" event={"ID":"8077458f-c8a9-4cc7-a8b1-4ae376d5e5ae","Type":"ContainerStarted","Data":"a55b17594d7a87859f0a64ee5e7210a09f96bb0195617ef901e7eade68da01e0"} Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.924625 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-catalog-content\") pod \"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.924674 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kklnz\" (UniqueName: \"kubernetes.io/projected/c18aa314-a66a-4cb6-95ec-d605e999b29f-kube-api-access-kklnz\") pod \"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.924724 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-utilities\") pod 
\"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.925100 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-utilities\") pod \"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.925307 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-catalog-content\") pod \"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.931470 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.934211 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv" event={"ID":"4e6e8c5a-b937-4072-902d-28e056de16d2","Type":"ContainerDied","Data":"fdb9873985c6b8d0f9f3ef053d6eb8451198c0e54057ebbd360f6969b9d250ed"} Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.934237 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb9873985c6b8d0f9f3ef053d6eb8451198c0e54057ebbd360f6969b9d250ed" Dec 16 06:53:40 crc kubenswrapper[4789]: I1216 06:53:40.982040 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kklnz\" (UniqueName: \"kubernetes.io/projected/c18aa314-a66a-4cb6-95ec-d605e999b29f-kube-api-access-kklnz\") pod \"community-operators-dmjv4\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.025873 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.097552 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7stjw" podStartSLOduration=18.097534687 podStartE2EDuration="18.097534687s" podCreationTimestamp="2025-12-16 06:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:40.993369642 +0000 UTC m=+159.255257271" watchObservedRunningTime="2025-12-16 06:53:41.097534687 +0000 UTC m=+159.359422316" Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.104722 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.165688 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btjcd"] Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.265372 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 06:53:41 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld Dec 16 06:53:41 crc kubenswrapper[4789]: [+]process-running ok Dec 16 06:53:41 crc kubenswrapper[4789]: healthz check failed Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.265445 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.386901 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-dmjv4"] Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.404868 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.405156 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.410895 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:41 crc kubenswrapper[4789]: W1216 06:53:41.443945 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18aa314_a66a_4cb6_95ec_d605e999b29f.slice/crio-aa30df2d66620c98efe965fa1f73f21334ebd90701fa55796c1958efdc5ae769 WatchSource:0}: Error finding container aa30df2d66620c98efe965fa1f73f21334ebd90701fa55796c1958efdc5ae769: Status 404 returned error can't find the container with id aa30df2d66620c98efe965fa1f73f21334ebd90701fa55796c1958efdc5ae769 Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.537984 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qphpp"] Dec 16 06:53:41 crc kubenswrapper[4789]: W1216 06:53:41.544980 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffdc85ec_5987_47f0_af71_9896f60cb294.slice/crio-eaf0c637801ac21fd1e5adf7bc07584d25d0bebe37c4b89bf859602f815a20e2 WatchSource:0}: Error finding container eaf0c637801ac21fd1e5adf7bc07584d25d0bebe37c4b89bf859602f815a20e2: Status 404 returned error can't find the container with id eaf0c637801ac21fd1e5adf7bc07584d25d0bebe37c4b89bf859602f815a20e2 Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.553904 4789 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5786"] Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.938345 4789 generic.go:334] "Generic (PLEG): container finished" podID="9af33cc0-7e86-482a-b3a1-89df07600676" containerID="869b58f98046f7050c25b7f1658ad22c03a664cdad95b89562eaddce0216ee9d" exitCode=0 Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.938604 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7k7" event={"ID":"9af33cc0-7e86-482a-b3a1-89df07600676","Type":"ContainerDied","Data":"869b58f98046f7050c25b7f1658ad22c03a664cdad95b89562eaddce0216ee9d"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.939863 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.940047 4789 generic.go:334] "Generic (PLEG): container finished" podID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerID="5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b" exitCode=0 Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.940112 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qphpp" event={"ID":"ffdc85ec-5987-47f0-af71-9896f60cb294","Type":"ContainerDied","Data":"5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.940133 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qphpp" event={"ID":"ffdc85ec-5987-47f0-af71-9896f60cb294","Type":"ContainerStarted","Data":"eaf0c637801ac21fd1e5adf7bc07584d25d0bebe37c4b89bf859602f815a20e2"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.941416 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" 
event={"ID":"be028739-1351-4883-95ec-35fb89831c72","Type":"ContainerStarted","Data":"7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.941446 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" event={"ID":"be028739-1351-4883-95ec-35fb89831c72","Type":"ContainerStarted","Data":"8f51d0c1db093f198653e4c032c99d097e45ae41066d6d75da56e63ba42df4c3"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.941497 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.942849 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"afea063b-95be-4dca-b2f0-dda1b3c7e3f2","Type":"ContainerStarted","Data":"8cd27ab9553914e3d6f881108b5b55bd135730bf9d9238286a898096166119f0"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.942890 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"afea063b-95be-4dca-b2f0-dda1b3c7e3f2","Type":"ContainerStarted","Data":"863c6994f30a67573406fa298c16760c554c666bf00e459869a12f0ef332b487"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.944718 4789 generic.go:334] "Generic (PLEG): container finished" podID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerID="c00edfc76110dd75e59e6631c712b3642f561d89a273a76a53259b76ada30a7e" exitCode=0 Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.944779 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmjv4" event={"ID":"c18aa314-a66a-4cb6-95ec-d605e999b29f","Type":"ContainerDied","Data":"c00edfc76110dd75e59e6631c712b3642f561d89a273a76a53259b76ada30a7e"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.944803 4789 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-dmjv4" event={"ID":"c18aa314-a66a-4cb6-95ec-d605e999b29f","Type":"ContainerStarted","Data":"aa30df2d66620c98efe965fa1f73f21334ebd90701fa55796c1958efdc5ae769"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.946192 4789 generic.go:334] "Generic (PLEG): container finished" podID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerID="873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534" exitCode=0 Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.946258 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btjcd" event={"ID":"8a620056-2e2e-46ae-9a32-c8aea4b297c4","Type":"ContainerDied","Data":"873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.946285 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btjcd" event={"ID":"8a620056-2e2e-46ae-9a32-c8aea4b297c4","Type":"ContainerStarted","Data":"0500c04f6352d976e18daadcf259e8ca2fe642164bc0f2493aec73fa72e25044"} Dec 16 06:53:41 crc kubenswrapper[4789]: I1216 06:53:41.952048 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6nzjt" Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.024003 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.023981896 podStartE2EDuration="2.023981896s" podCreationTimestamp="2025-12-16 06:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:41.992456601 +0000 UTC m=+160.254344250" watchObservedRunningTime="2025-12-16 06:53:42.023981896 +0000 UTC m=+160.285869525" Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.024655 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" podStartSLOduration=141.024649061 podStartE2EDuration="2m21.024649061s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:42.022811709 +0000 UTC m=+160.284699328" watchObservedRunningTime="2025-12-16 06:53:42.024649061 +0000 UTC m=+160.286536690" Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.092679 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5wdfg"] Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.093619 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.096221 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.112045 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.112442 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wdfg"] Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.152634 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-catalog-content\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.152706 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqtq\" (UniqueName: \"kubernetes.io/projected/706aa82a-2280-4715-919c-a480a2a81f8d-kube-api-access-8fqtq\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.152736 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-utilities\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.254454 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqtq\" (UniqueName: \"kubernetes.io/projected/706aa82a-2280-4715-919c-a480a2a81f8d-kube-api-access-8fqtq\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.254782 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-utilities\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.254852 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-catalog-content\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.255342 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-catalog-content\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.255413 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-utilities\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.264374 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 06:53:42 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld
Dec 16 06:53:42 crc kubenswrapper[4789]: [+]process-running ok
Dec 16 06:53:42 crc kubenswrapper[4789]: healthz check failed
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.264443 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.271546 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqtq\" (UniqueName: \"kubernetes.io/projected/706aa82a-2280-4715-919c-a480a2a81f8d-kube-api-access-8fqtq\") pod \"redhat-marketplace-5wdfg\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.408801 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wdfg"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.499167 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dg8zr"]
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.500384 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.506964 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dg8zr"]
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.558993 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxrx\" (UniqueName: \"kubernetes.io/projected/f2a44056-0b8f-4209-b92d-cfb1110ba626-kube-api-access-fdxrx\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.559054 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-catalog-content\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.559086 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-utilities\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.604552 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wdfg"]
Dec 16 06:53:42 crc kubenswrapper[4789]: W1216 06:53:42.610439 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706aa82a_2280_4715_919c_a480a2a81f8d.slice/crio-3d42e9d8217933efc87c18af42abbf56f4c0775814b6daa2e0aa407fe21a5abe WatchSource:0}: Error finding container 3d42e9d8217933efc87c18af42abbf56f4c0775814b6daa2e0aa407fe21a5abe: Status 404 returned error can't find the container with id 3d42e9d8217933efc87c18af42abbf56f4c0775814b6daa2e0aa407fe21a5abe
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.660041 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdxrx\" (UniqueName: \"kubernetes.io/projected/f2a44056-0b8f-4209-b92d-cfb1110ba626-kube-api-access-fdxrx\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.660100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-catalog-content\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.660133 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-utilities\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.660478 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-utilities\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.661346 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-catalog-content\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.678470 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdxrx\" (UniqueName: \"kubernetes.io/projected/f2a44056-0b8f-4209-b92d-cfb1110ba626-kube-api-access-fdxrx\") pod \"redhat-marketplace-dg8zr\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") " pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.825976 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.865542 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.870362 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/21ceea53-d8d0-48a9-8c27-5cdd1028f0b7-metrics-certs\") pod \"network-metrics-daemon-ttcm5\" (UID: \"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7\") " pod="openshift-multus/network-metrics-daemon-ttcm5"
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.955359 4789 generic.go:334] "Generic (PLEG): container finished" podID="afea063b-95be-4dca-b2f0-dda1b3c7e3f2" containerID="8cd27ab9553914e3d6f881108b5b55bd135730bf9d9238286a898096166119f0" exitCode=0
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.955642 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"afea063b-95be-4dca-b2f0-dda1b3c7e3f2","Type":"ContainerDied","Data":"8cd27ab9553914e3d6f881108b5b55bd135730bf9d9238286a898096166119f0"}
Dec 16 06:53:42 crc kubenswrapper[4789]: I1216 06:53:42.960212 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wdfg" event={"ID":"706aa82a-2280-4715-919c-a480a2a81f8d","Type":"ContainerStarted","Data":"3d42e9d8217933efc87c18af42abbf56f4c0775814b6daa2e0aa407fe21a5abe"}
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.018024 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ttcm5"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.057407 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dg8zr"]
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.210332 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ttcm5"]
Dec 16 06:53:43 crc kubenswrapper[4789]: W1216 06:53:43.215081 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ceea53_d8d0_48a9_8c27_5cdd1028f0b7.slice/crio-0e426a0278a4dbffb2a69876ba00adb6598fc01222e2bbc72c01d22f4d3f4a49 WatchSource:0}: Error finding container 0e426a0278a4dbffb2a69876ba00adb6598fc01222e2bbc72c01d22f4d3f4a49: Status 404 returned error can't find the container with id 0e426a0278a4dbffb2a69876ba00adb6598fc01222e2bbc72c01d22f4d3f4a49
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.264306 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 06:53:43 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld
Dec 16 06:53:43 crc kubenswrapper[4789]: [+]process-running ok
Dec 16 06:53:43 crc kubenswrapper[4789]: healthz check failed
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.264364 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.285500 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lw2sm"]
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.286811 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.288400 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.293721 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw2sm"]
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.372295 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-utilities\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.372442 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-catalog-content\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.372500 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcct\" (UniqueName: \"kubernetes.io/projected/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-kube-api-access-9fcct\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.473888 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-utilities\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.474030 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-catalog-content\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.474519 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-catalog-content\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.474562 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fcct\" (UniqueName: \"kubernetes.io/projected/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-kube-api-access-9fcct\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.474964 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-utilities\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.496856 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcct\" (UniqueName: \"kubernetes.io/projected/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-kube-api-access-9fcct\") pod \"redhat-operators-lw2sm\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.608689 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw2sm"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.685221 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6lz9"]
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.686246 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.732480 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6lz9"]
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.777645 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-utilities\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.777817 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-catalog-content\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.777865 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489xf\" (UniqueName: \"kubernetes.io/projected/27e6916f-d4df-4de1-8781-b7efdc23fff9-kube-api-access-489xf\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.879699 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-catalog-content\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.879753 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-489xf\" (UniqueName: \"kubernetes.io/projected/27e6916f-d4df-4de1-8781-b7efdc23fff9-kube-api-access-489xf\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.879879 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-utilities\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.880778 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-utilities\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.881127 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-catalog-content\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.907505 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-489xf\" (UniqueName: \"kubernetes.io/projected/27e6916f-d4df-4de1-8781-b7efdc23fff9-kube-api-access-489xf\") pod \"redhat-operators-p6lz9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.976103 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" event={"ID":"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7","Type":"ContainerStarted","Data":"0e426a0278a4dbffb2a69876ba00adb6598fc01222e2bbc72c01d22f4d3f4a49"}
Dec 16 06:53:43 crc kubenswrapper[4789]: I1216 06:53:43.977700 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dg8zr" event={"ID":"f2a44056-0b8f-4209-b92d-cfb1110ba626","Type":"ContainerStarted","Data":"dc1364c001d4d2c7d6fabfab30957f40abe27ec7e2f16afcca5d55f700b07ab7"}
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.002573 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6lz9"
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.263559 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 06:53:44 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld
Dec 16 06:53:44 crc kubenswrapper[4789]: [+]process-running ok
Dec 16 06:53:44 crc kubenswrapper[4789]: healthz check failed
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.263833 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.379974 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.490440 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kube-api-access\") pod \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\" (UID: \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") "
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.490510 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kubelet-dir\") pod \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\" (UID: \"afea063b-95be-4dca-b2f0-dda1b3c7e3f2\") "
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.490839 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "afea063b-95be-4dca-b2f0-dda1b3c7e3f2" (UID: "afea063b-95be-4dca-b2f0-dda1b3c7e3f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.516131 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "afea063b-95be-4dca-b2f0-dda1b3c7e3f2" (UID: "afea063b-95be-4dca-b2f0-dda1b3c7e3f2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.593822 4789 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.594264 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afea063b-95be-4dca-b2f0-dda1b3c7e3f2-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.806998 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6lz9"]
Dec 16 06:53:44 crc kubenswrapper[4789]: W1216 06:53:44.816652 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e6916f_d4df_4de1_8781_b7efdc23fff9.slice/crio-272bc472c64b765c6435ae05a02c66e7caafbe1cd9dee64fc4711815ec9bea28 WatchSource:0}: Error finding container 272bc472c64b765c6435ae05a02c66e7caafbe1cd9dee64fc4711815ec9bea28: Status 404 returned error can't find the container with id 272bc472c64b765c6435ae05a02c66e7caafbe1cd9dee64fc4711815ec9bea28
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.986463 4789 generic.go:334] "Generic (PLEG): container finished" podID="706aa82a-2280-4715-919c-a480a2a81f8d" containerID="d5997445e0f915ffcc6f95c6399f43a797da82179516d62f02ff05af3bf39f33" exitCode=0
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.986520 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wdfg" event={"ID":"706aa82a-2280-4715-919c-a480a2a81f8d","Type":"ContainerDied","Data":"d5997445e0f915ffcc6f95c6399f43a797da82179516d62f02ff05af3bf39f33"}
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.988126 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6lz9" event={"ID":"27e6916f-d4df-4de1-8781-b7efdc23fff9","Type":"ContainerStarted","Data":"272bc472c64b765c6435ae05a02c66e7caafbe1cd9dee64fc4711815ec9bea28"}
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.991257 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"afea063b-95be-4dca-b2f0-dda1b3c7e3f2","Type":"ContainerDied","Data":"863c6994f30a67573406fa298c16760c554c666bf00e459869a12f0ef332b487"}
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.991284 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.991297 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="863c6994f30a67573406fa298c16760c554c666bf00e459869a12f0ef332b487"
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.994354 4789 generic.go:334] "Generic (PLEG): container finished" podID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerID="72e6f82fa8014408b7e6f722af7f3c1036333cc4581c46a769b6a7eff65e09be" exitCode=0
Dec 16 06:53:44 crc kubenswrapper[4789]: I1216 06:53:44.994387 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dg8zr" event={"ID":"f2a44056-0b8f-4209-b92d-cfb1110ba626","Type":"ContainerDied","Data":"72e6f82fa8014408b7e6f722af7f3c1036333cc4581c46a769b6a7eff65e09be"}
Dec 16 06:53:45 crc kubenswrapper[4789]: I1216 06:53:45.092463 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw2sm"]
Dec 16 06:53:45 crc kubenswrapper[4789]: W1216 06:53:45.112137 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca1b1396_77d5_4d37_a72b_fcb9591cbf40.slice/crio-fd9c5fb4e313087368a94e12220776fa5456f4892c7dfc0eb59b4659714562ea WatchSource:0}: Error finding container fd9c5fb4e313087368a94e12220776fa5456f4892c7dfc0eb59b4659714562ea: Status 404 returned error can't find the container with id fd9c5fb4e313087368a94e12220776fa5456f4892c7dfc0eb59b4659714562ea
Dec 16 06:53:45 crc kubenswrapper[4789]: I1216 06:53:45.266218 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 06:53:45 crc kubenswrapper[4789]: [-]has-synced failed: reason withheld
Dec 16 06:53:45 crc kubenswrapper[4789]: [+]process-running ok
Dec 16 06:53:45 crc kubenswrapper[4789]: healthz check failed
Dec 16 06:53:45 crc kubenswrapper[4789]: I1216 06:53:45.266326 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 06:53:45 crc kubenswrapper[4789]: I1216 06:53:45.959646 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-66bwg"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.036719 4789 generic.go:334] "Generic (PLEG): container finished" podID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerID="a4a2374f1fd1d284bc72262bdcee9eef9879e3f8933b20312ae2bd2cb1afa1c9" exitCode=0
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.036801 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6lz9" event={"ID":"27e6916f-d4df-4de1-8781-b7efdc23fff9","Type":"ContainerDied","Data":"a4a2374f1fd1d284bc72262bdcee9eef9879e3f8933b20312ae2bd2cb1afa1c9"}
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.040157 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" event={"ID":"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7","Type":"ContainerStarted","Data":"bad9b3247582328ec038d32c962c708675667b100c86ed244f1374159b557434"}
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.042092 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerID="f491a6d37655c9eae25e2da1ed574968d517686d0b91eec7bd6800ba205e2595" exitCode=0
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.042129 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw2sm" event={"ID":"ca1b1396-77d5-4d37-a72b-fcb9591cbf40","Type":"ContainerDied","Data":"f491a6d37655c9eae25e2da1ed574968d517686d0b91eec7bd6800ba205e2595"}
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.042158 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw2sm" event={"ID":"ca1b1396-77d5-4d37-a72b-fcb9591cbf40","Type":"ContainerStarted","Data":"fd9c5fb4e313087368a94e12220776fa5456f4892c7dfc0eb59b4659714562ea"}
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.278693 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pwj9t"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.281606 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pwj9t"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.338834 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.345644 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.694155 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-cbqb2"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.694520 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-cbqb2"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.696770 4789 patch_prober.go:28] interesting pod/console-f9d7485db-cbqb2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.696833 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-cbqb2" podUID="28e992ee-e81f-46d7-b422-27fa3023b7d8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused"
Dec 16 06:53:46 crc kubenswrapper[4789]: I1216 06:53:46.718727 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44"
Dec 16 06:53:47 crc kubenswrapper[4789]: I1216 06:53:47.077487 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ttcm5" event={"ID":"21ceea53-d8d0-48a9-8c27-5cdd1028f0b7","Type":"ContainerStarted","Data":"5e769b790122f79d037ab655e5687bbc5b61caf8a2c84d5b6c0e9d3fe700adf0"}
Dec 16 06:53:47 crc kubenswrapper[4789]: I1216 06:53:47.102732 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ttcm5" podStartSLOduration=146.102714817 podStartE2EDuration="2m26.102714817s" podCreationTimestamp="2025-12-16 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:53:47.101058359 +0000 UTC m=+165.362945988" watchObservedRunningTime="2025-12-16 06:53:47.102714817 +0000 UTC m=+165.364602446"
Dec 16 06:53:47 crc kubenswrapper[4789]: I1216 06:53:47.784755 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-98ssf"
Dec 16 06:53:51 crc kubenswrapper[4789]: I1216 06:53:51.928067 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 06:53:51 crc kubenswrapper[4789]: I1216 06:53:51.928457 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 06:53:52 crc kubenswrapper[4789]: I1216 06:53:52.175118 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6ctc"]
Dec 16 06:53:52 crc kubenswrapper[4789]: I1216 06:53:52.175424 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" podUID="2cbecadb-0f2a-443e-b065-edc627985d96" containerName="controller-manager" containerID="cri-o://8761244a5e8a9f5eddfc512b06f82d2e14b47656c096730a7fa70012c80fe510" gracePeriod=30
Dec 16 06:53:52 crc kubenswrapper[4789]: I1216 06:53:52.190343 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh"]
Dec 16 06:53:52 crc kubenswrapper[4789]: I1216 06:53:52.192862 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerName="route-controller-manager" containerID="cri-o://01eb85cf7c8025f23a704fb0fb3111decb0621182b844fb9c22307e77df967d8" gracePeriod=30
Dec 16 06:53:53 crc kubenswrapper[4789]: I1216 06:53:53.129491 4789 generic.go:334] "Generic (PLEG): container finished" podID="2cbecadb-0f2a-443e-b065-edc627985d96" containerID="8761244a5e8a9f5eddfc512b06f82d2e14b47656c096730a7fa70012c80fe510" exitCode=0
Dec 16 06:53:53 crc kubenswrapper[4789]: I1216 06:53:53.129594 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" event={"ID":"2cbecadb-0f2a-443e-b065-edc627985d96","Type":"ContainerDied","Data":"8761244a5e8a9f5eddfc512b06f82d2e14b47656c096730a7fa70012c80fe510"}
Dec 16 06:53:53 crc kubenswrapper[4789]: I1216 06:53:53.135698 4789 generic.go:334] "Generic (PLEG): container finished" podID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerID="01eb85cf7c8025f23a704fb0fb3111decb0621182b844fb9c22307e77df967d8" exitCode=0
Dec 16 06:53:53 crc kubenswrapper[4789]: I1216 06:53:53.135749 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" event={"ID":"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b","Type":"ContainerDied","Data":"01eb85cf7c8025f23a704fb0fb3111decb0621182b844fb9c22307e77df967d8"}
Dec 16 06:53:56 crc kubenswrapper[4789]: I1216 06:53:56.697949 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-cbqb2"
Dec 16 06:53:56 crc kubenswrapper[4789]: I1216 06:53:56.702158 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-cbqb2"
Dec 16 06:53:56 crc kubenswrapper[4789]: I1216 06:53:56.793750 4789 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-kpcbh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 06:53:56 crc kubenswrapper[4789]: I1216 06:53:56.793830 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 16 06:53:57 crc kubenswrapper[4789]: I1216 06:53:57.338103 4789 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l6ctc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: i/o timeout" start-of-body=
Dec 16 06:53:57 crc kubenswrapper[4789]: I1216 06:53:57.338150 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" podUID="2cbecadb-0f2a-443e-b065-edc627985d96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: i/o timeout"
Dec 16 06:54:00 crc kubenswrapper[4789]: I1216 06:54:00.898552 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-p5786"
Dec 16 06:54:06 crc kubenswrapper[4789]: I1216 06:54:06.793654 4789 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-kpcbh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 06:54:06
crc kubenswrapper[4789]: I1216 06:54:06.794190 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:54:07 crc kubenswrapper[4789]: I1216 06:54:07.338773 4789 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l6ctc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:54:07 crc kubenswrapper[4789]: I1216 06:54:07.338845 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" podUID="2cbecadb-0f2a-443e-b065-edc627985d96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:54:08 crc kubenswrapper[4789]: I1216 06:54:08.651570 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 06:54:11 crc kubenswrapper[4789]: E1216 06:54:11.709537 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 06:54:11 crc kubenswrapper[4789]: E1216 06:54:11.709748 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wdkl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qphpp_openshift-marketplace(ffdc85ec-5987-47f0-af71-9896f60cb294): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:54:11 crc kubenswrapper[4789]: E1216 06:54:11.711060 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qphpp" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" Dec 16 06:54:12 crc kubenswrapper[4789]: E1216 06:54:12.712978 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qphpp" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" Dec 16 06:54:12 crc kubenswrapper[4789]: E1216 06:54:12.756591 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 06:54:12 crc kubenswrapper[4789]: E1216 06:54:12.756997 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8gcv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-btjcd_openshift-marketplace(8a620056-2e2e-46ae-9a32-c8aea4b297c4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:54:12 crc kubenswrapper[4789]: E1216 06:54:12.759497 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-btjcd" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" Dec 16 06:54:12 crc 
kubenswrapper[4789]: E1216 06:54:12.879861 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 06:54:12 crc kubenswrapper[4789]: E1216 06:54:12.880022 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fqtq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-5wdfg_openshift-marketplace(706aa82a-2280-4715-919c-a480a2a81f8d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:54:12 crc kubenswrapper[4789]: E1216 06:54:12.881569 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5wdfg" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" Dec 16 06:54:14 crc kubenswrapper[4789]: I1216 06:54:14.872595 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 06:54:14 crc kubenswrapper[4789]: E1216 06:54:14.873470 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afea063b-95be-4dca-b2f0-dda1b3c7e3f2" containerName="pruner" Dec 16 06:54:14 crc kubenswrapper[4789]: I1216 06:54:14.873488 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="afea063b-95be-4dca-b2f0-dda1b3c7e3f2" containerName="pruner" Dec 16 06:54:14 crc kubenswrapper[4789]: I1216 06:54:14.873970 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="afea063b-95be-4dca-b2f0-dda1b3c7e3f2" containerName="pruner" Dec 16 06:54:14 crc kubenswrapper[4789]: I1216 06:54:14.874829 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:14 crc kubenswrapper[4789]: I1216 06:54:14.897494 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 06:54:14 crc kubenswrapper[4789]: I1216 06:54:14.897779 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 06:54:14 crc kubenswrapper[4789]: I1216 06:54:14.903385 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 06:54:15 crc kubenswrapper[4789]: I1216 06:54:15.046860 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:15 crc kubenswrapper[4789]: I1216 06:54:15.046975 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:15 crc kubenswrapper[4789]: I1216 06:54:15.147843 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:15 crc kubenswrapper[4789]: I1216 06:54:15.147953 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:15 crc kubenswrapper[4789]: I1216 06:54:15.148009 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:15 crc kubenswrapper[4789]: I1216 06:54:15.166654 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:15 crc kubenswrapper[4789]: I1216 06:54:15.214973 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.062983 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5wdfg" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.063001 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-btjcd" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.140592 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.145286 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.164066 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.164367 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdxrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,
ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dg8zr_openshift-marketplace(f2a44056-0b8f-4209-b92d-cfb1110ba626): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.165642 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dg8zr" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.188444 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"] Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.188708 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerName="route-controller-manager" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.188726 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerName="route-controller-manager" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.188750 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbecadb-0f2a-443e-b065-edc627985d96" containerName="controller-manager" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.188757 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbecadb-0f2a-443e-b065-edc627985d96" containerName="controller-manager" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.188855 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerName="route-controller-manager" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 
06:54:16.188865 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cbecadb-0f2a-443e-b065-edc627985d96" containerName="controller-manager" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.189302 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.192855 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"] Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.197234 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.197386 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bblh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5r7k7_openshift-marketplace(9af33cc0-7e86-482a-b3a1-89df07600676): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 06:54:16 crc kubenswrapper[4789]: E1216 06:54:16.198556 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5r7k7" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" Dec 16 06:54:16 crc 
kubenswrapper[4789]: I1216 06:54:16.262233 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.262223 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" event={"ID":"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b","Type":"ContainerDied","Data":"8a7fb95d9ac2ed889165af45ac4b86a1843b364d387ee4ddacfe9530a58b0d99"} Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.262314 4789 scope.go:117] "RemoveContainer" containerID="01eb85cf7c8025f23a704fb0fb3111decb0621182b844fb9c22307e77df967d8" Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.262701 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkdcv\" (UniqueName: \"kubernetes.io/projected/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-kube-api-access-tkdcv\") pod \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.262739 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp5lc\" (UniqueName: \"kubernetes.io/projected/2cbecadb-0f2a-443e-b065-edc627985d96-kube-api-access-jp5lc\") pod \"2cbecadb-0f2a-443e-b065-edc627985d96\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") " Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.262777 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-serving-cert\") pod \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") " Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.262830 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-config\") pod \"2cbecadb-0f2a-443e-b065-edc627985d96\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") "
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.263008 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-client-ca\") pod \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") "
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.263605 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-client-ca" (OuterVolumeSpecName: "client-ca") pod "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" (UID: "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.263040 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cbecadb-0f2a-443e-b065-edc627985d96-serving-cert\") pod \"2cbecadb-0f2a-443e-b065-edc627985d96\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") "
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.263691 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-proxy-ca-bundles\") pod \"2cbecadb-0f2a-443e-b065-edc627985d96\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") "
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.263697 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-config" (OuterVolumeSpecName: "config") pod "2cbecadb-0f2a-443e-b065-edc627985d96" (UID: "2cbecadb-0f2a-443e-b065-edc627985d96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.264269 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2cbecadb-0f2a-443e-b065-edc627985d96" (UID: "2cbecadb-0f2a-443e-b065-edc627985d96"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.264350 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-config\") pod \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\" (UID: \"2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b\") "
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.264375 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-client-ca\") pod \"2cbecadb-0f2a-443e-b065-edc627985d96\" (UID: \"2cbecadb-0f2a-443e-b065-edc627985d96\") "
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.264972 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-config" (OuterVolumeSpecName: "config") pod "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" (UID: "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.265004 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cbecadb-0f2a-443e-b065-edc627985d96" (UID: "2cbecadb-0f2a-443e-b065-edc627985d96"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.266008 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-config\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.266026 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-client-ca\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.266040 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-config\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.266053 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-client-ca\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.266065 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2cbecadb-0f2a-443e-b065-edc627985d96-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.270296 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc" event={"ID":"2cbecadb-0f2a-443e-b065-edc627985d96","Type":"ContainerDied","Data":"7a29fee741ba4b33ab36257fb079fe0d2e8a25e1eaa33a83c11b7d5c22190099"}
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.270509 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6ctc"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.271283 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbecadb-0f2a-443e-b065-edc627985d96-kube-api-access-jp5lc" (OuterVolumeSpecName: "kube-api-access-jp5lc") pod "2cbecadb-0f2a-443e-b065-edc627985d96" (UID: "2cbecadb-0f2a-443e-b065-edc627985d96"). InnerVolumeSpecName "kube-api-access-jp5lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.271413 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbecadb-0f2a-443e-b065-edc627985d96-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cbecadb-0f2a-443e-b065-edc627985d96" (UID: "2cbecadb-0f2a-443e-b065-edc627985d96"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.272441 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" (UID: "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.272299 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-kube-api-access-tkdcv" (OuterVolumeSpecName: "kube-api-access-tkdcv") pod "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" (UID: "2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b"). InnerVolumeSpecName "kube-api-access-tkdcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.332013 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ptbc9"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367440 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42mrt\" (UniqueName: \"kubernetes.io/projected/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-kube-api-access-42mrt\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367516 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-config\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367538 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-client-ca\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367579 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-serving-cert\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367613 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkdcv\" (UniqueName: \"kubernetes.io/projected/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-kube-api-access-tkdcv\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367659 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp5lc\" (UniqueName: \"kubernetes.io/projected/2cbecadb-0f2a-443e-b065-edc627985d96-kube-api-access-jp5lc\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367669 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.367677 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cbecadb-0f2a-443e-b065-edc627985d96-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.468586 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-config\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.468638 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-client-ca\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.468666 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-serving-cert\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.468713 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42mrt\" (UniqueName: \"kubernetes.io/projected/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-kube-api-access-42mrt\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.469558 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-client-ca\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.469758 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-config\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.481221 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-serving-cert\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.494489 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42mrt\" (UniqueName: \"kubernetes.io/projected/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-kube-api-access-42mrt\") pod \"route-controller-manager-65b5f8b848-r8jgt\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.508362 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.592561 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh"]
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.599367 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh"]
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.606460 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6ctc"]
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.609546 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6ctc"]
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.793594 4789 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-kpcbh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: i/o timeout" start-of-body=
Dec 16 06:54:16 crc kubenswrapper[4789]: I1216 06:54:16.793651 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpcbh" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: i/o timeout"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.114033 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cbecadb-0f2a-443e-b065-edc627985d96" path="/var/lib/kubelet/pods/2cbecadb-0f2a-443e-b065-edc627985d96/volumes"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.115015 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b" path="/var/lib/kubelet/pods/2eb2b3d2-a3ca-48f0-b2e6-58e0dcceab5b/volumes"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.294223 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"]
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.295247 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.300989 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.301140 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.301487 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.302349 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.304505 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"]
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.349598 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.350321 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.351802 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.391874 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-client-ca\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.391928 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-proxy-ca-bundles\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.392068 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5wm\" (UniqueName: \"kubernetes.io/projected/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-kube-api-access-kv5wm\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.392124 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-config\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.392245 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-serving-cert\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.493036 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-serving-cert\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.493102 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-client-ca\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.493121 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-proxy-ca-bundles\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.493176 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5wm\" (UniqueName: \"kubernetes.io/projected/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-kube-api-access-kv5wm\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.493206 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-config\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.494476 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-client-ca\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.494634 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-proxy-ca-bundles\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.494671 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-config\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.496829 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-serving-cert\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.515150 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5wm\" (UniqueName: \"kubernetes.io/projected/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-kube-api-access-kv5wm\") pod \"controller-manager-676b6cc5d8-qvtwg\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:18 crc kubenswrapper[4789]: I1216 06:54:18.667751 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.073336 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.076838 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.078542 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.201871 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-var-lock\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.202003 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8ad11d7-1675-46d0-9e72-10f68b56a823-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.202055 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.303336 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.303385 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-var-lock\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.303495 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.303519 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-var-lock\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.303561 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8ad11d7-1675-46d0-9e72-10f68b56a823-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.328980 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8ad11d7-1675-46d0-9e72-10f68b56a823-kube-api-access\") pod \"installer-9-crc\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.401776 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:19 crc kubenswrapper[4789]: E1216 06:54:19.735282 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5r7k7" podUID="9af33cc0-7e86-482a-b3a1-89df07600676"
Dec 16 06:54:19 crc kubenswrapper[4789]: E1216 06:54:19.735294 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dg8zr" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626"
Dec 16 06:54:19 crc kubenswrapper[4789]: I1216 06:54:19.743351 4789 scope.go:117] "RemoveContainer" containerID="8761244a5e8a9f5eddfc512b06f82d2e14b47656c096730a7fa70012c80fe510"
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.053851 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.145297 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"]
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.148362 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.158948 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"]
Dec 16 06:54:20 crc kubenswrapper[4789]: W1216 06:54:20.178137 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca9c7707_0f10_4a53_b2f2_d8cf7c37ab6c.slice/crio-32ef8b25913818061be06c09711d217be5be34ce777adc18b0ed9d1b983c77e8 WatchSource:0}: Error finding container 32ef8b25913818061be06c09711d217be5be34ce777adc18b0ed9d1b983c77e8: Status 404 returned error can't find the container with id 32ef8b25913818061be06c09711d217be5be34ce777adc18b0ed9d1b983c77e8
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.291508 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmjv4" event={"ID":"c18aa314-a66a-4cb6-95ec-d605e999b29f","Type":"ContainerStarted","Data":"fabca29482fe20c154047a05715c0c3c89fdd9766b31627d843a9eeb7b9e67c5"}
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.295729 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" event={"ID":"67692a1d-1522-4b3c-bd9e-a6151d1ef98c","Type":"ContainerStarted","Data":"d24ae2302f6d5c6049e91e03f2da0ff4e1fae05aa56f89f6dc654e8ff1ba9e88"}
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.298575 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6lz9" event={"ID":"27e6916f-d4df-4de1-8781-b7efdc23fff9","Type":"ContainerStarted","Data":"d154c5004ec04a293b925aecd4291d0bfc3df9a06439f9b0d5b141fa4829ce35"}
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.300805 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"03da8f96-51e6-4bfb-8d36-2d1cb855cf99","Type":"ContainerStarted","Data":"5b7bbbe90f264bd34332704de6e880ce4b8f5f1b34ff42aff3c884aeec3b62e9"}
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.304446 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw2sm" event={"ID":"ca1b1396-77d5-4d37-a72b-fcb9591cbf40","Type":"ContainerStarted","Data":"78c0f9d46053800a938b83570d2c3a657a36744d819e6b4c1e464ed6fbd9ec08"}
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.306052 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" event={"ID":"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c","Type":"ContainerStarted","Data":"32ef8b25913818061be06c09711d217be5be34ce777adc18b0ed9d1b983c77e8"}
Dec 16 06:54:20 crc kubenswrapper[4789]: I1216 06:54:20.307299 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8ad11d7-1675-46d0-9e72-10f68b56a823","Type":"ContainerStarted","Data":"93fceea3a4444d010bb488159be0c9e74044aa0699d68e46b4028684afd2c7ca"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.314004 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" event={"ID":"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c","Type":"ContainerStarted","Data":"608a0929d4464dd2aa54974f9a6bba308991697d1203b9d809abfa7468851dd5"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.314483 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.317704 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8ad11d7-1675-46d0-9e72-10f68b56a823","Type":"ContainerStarted","Data":"85986a76a7d20ecb3d62797a0545ca75cb75644974cddf74c146bb56e8260ab1"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.320064 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.320366 4789 generic.go:334] "Generic (PLEG): container finished" podID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerID="d154c5004ec04a293b925aecd4291d0bfc3df9a06439f9b0d5b141fa4829ce35" exitCode=0
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.320435 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6lz9" event={"ID":"27e6916f-d4df-4de1-8781-b7efdc23fff9","Type":"ContainerDied","Data":"d154c5004ec04a293b925aecd4291d0bfc3df9a06439f9b0d5b141fa4829ce35"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.322655 4789 generic.go:334] "Generic (PLEG): container finished" podID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerID="fabca29482fe20c154047a05715c0c3c89fdd9766b31627d843a9eeb7b9e67c5" exitCode=0
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.322690 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmjv4" event={"ID":"c18aa314-a66a-4cb6-95ec-d605e999b29f","Type":"ContainerDied","Data":"fabca29482fe20c154047a05715c0c3c89fdd9766b31627d843a9eeb7b9e67c5"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.325983 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" event={"ID":"67692a1d-1522-4b3c-bd9e-a6151d1ef98c","Type":"ContainerStarted","Data":"7456ab684083f1c25507488e0a1c0da6dccfeac630852e327b89636e6e3d8360"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.327770 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.330731 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.331209 4789 generic.go:334] "Generic (PLEG): container finished" podID="03da8f96-51e6-4bfb-8d36-2d1cb855cf99" containerID="939ba6cc17988d6fe82766d56bff8908077f4b940f3062fdf03370ee0e214f0e" exitCode=0
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.331252 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"03da8f96-51e6-4bfb-8d36-2d1cb855cf99","Type":"ContainerDied","Data":"939ba6cc17988d6fe82766d56bff8908077f4b940f3062fdf03370ee0e214f0e"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.335454 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" podStartSLOduration=10.335437944 podStartE2EDuration="10.335437944s" podCreationTimestamp="2025-12-16 06:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:54:21.332483732 +0000 UTC m=+199.594371361" watchObservedRunningTime="2025-12-16 06:54:21.335437944 +0000 UTC m=+199.597325573"
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.338161 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerID="78c0f9d46053800a938b83570d2c3a657a36744d819e6b4c1e464ed6fbd9ec08" exitCode=0
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.338199 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw2sm" event={"ID":"ca1b1396-77d5-4d37-a72b-fcb9591cbf40","Type":"ContainerDied","Data":"78c0f9d46053800a938b83570d2c3a657a36744d819e6b4c1e464ed6fbd9ec08"}
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.365392 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.365376042 podStartE2EDuration="2.365376042s" podCreationTimestamp="2025-12-16 06:54:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:54:21.363631674 +0000 UTC m=+199.625519303" watchObservedRunningTime="2025-12-16 06:54:21.365376042 +0000 UTC m=+199.627263671"
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.427377 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" podStartSLOduration=10.427353305 podStartE2EDuration="10.427353305s" podCreationTimestamp="2025-12-16 06:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:54:21.42641415 +0000 UTC m=+199.688301809" watchObservedRunningTime="2025-12-16 06:54:21.427353305 +0000 UTC m=+199.689240934"
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.927866 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 06:54:21 crc kubenswrapper[4789]: I1216 06:54:21.928180 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.345356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmjv4" event={"ID":"c18aa314-a66a-4cb6-95ec-d605e999b29f","Type":"ContainerStarted","Data":"027b52ddea537705bc074a05e8baeb0cd647251eb8791e9c261e7852ac2d0b97"}
Dec 16 06:54:22 crc kubenswrapper[4789]: I1216
06:54:22.347677 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw2sm" event={"ID":"ca1b1396-77d5-4d37-a72b-fcb9591cbf40","Type":"ContainerStarted","Data":"dc68cf5edf6a22837f52641014946cbc974bddf7d3d7cffed5287128ea44d580"} Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.350049 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6lz9" event={"ID":"27e6916f-d4df-4de1-8781-b7efdc23fff9","Type":"ContainerStarted","Data":"adeae7b46c054fdc774e37351351ee37858098ebcb31c6092a857fa3a42d6b41"} Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.368873 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dmjv4" podStartSLOduration=2.248747806 podStartE2EDuration="42.368854931s" podCreationTimestamp="2025-12-16 06:53:40 +0000 UTC" firstStartedPulling="2025-12-16 06:53:41.945925862 +0000 UTC m=+160.207813491" lastFinishedPulling="2025-12-16 06:54:22.066032987 +0000 UTC m=+200.327920616" observedRunningTime="2025-12-16 06:54:22.368177532 +0000 UTC m=+200.630065161" watchObservedRunningTime="2025-12-16 06:54:22.368854931 +0000 UTC m=+200.630742560" Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.389144 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6lz9" podStartSLOduration=3.291636377 podStartE2EDuration="39.389122341s" podCreationTimestamp="2025-12-16 06:53:43 +0000 UTC" firstStartedPulling="2025-12-16 06:53:46.039369901 +0000 UTC m=+164.301257530" lastFinishedPulling="2025-12-16 06:54:22.136855865 +0000 UTC m=+200.398743494" observedRunningTime="2025-12-16 06:54:22.389016238 +0000 UTC m=+200.650903867" watchObservedRunningTime="2025-12-16 06:54:22.389122341 +0000 UTC m=+200.651009980" Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.406524 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-lw2sm" podStartSLOduration=3.324090286 podStartE2EDuration="39.406507333s" podCreationTimestamp="2025-12-16 06:53:43 +0000 UTC" firstStartedPulling="2025-12-16 06:53:46.043391493 +0000 UTC m=+164.305279122" lastFinishedPulling="2025-12-16 06:54:22.12580854 +0000 UTC m=+200.387696169" observedRunningTime="2025-12-16 06:54:22.405300669 +0000 UTC m=+200.667188308" watchObservedRunningTime="2025-12-16 06:54:22.406507333 +0000 UTC m=+200.668394962" Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.595057 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.670212 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kube-api-access\") pod \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.670251 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kubelet-dir\") pod \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\" (UID: \"03da8f96-51e6-4bfb-8d36-2d1cb855cf99\") " Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.670581 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "03da8f96-51e6-4bfb-8d36-2d1cb855cf99" (UID: "03da8f96-51e6-4bfb-8d36-2d1cb855cf99"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.675907 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "03da8f96-51e6-4bfb-8d36-2d1cb855cf99" (UID: "03da8f96-51e6-4bfb-8d36-2d1cb855cf99"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.771861 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:22 crc kubenswrapper[4789]: I1216 06:54:22.771906 4789 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03da8f96-51e6-4bfb-8d36-2d1cb855cf99-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:23 crc kubenswrapper[4789]: I1216 06:54:23.357502 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"03da8f96-51e6-4bfb-8d36-2d1cb855cf99","Type":"ContainerDied","Data":"5b7bbbe90f264bd34332704de6e880ce4b8f5f1b34ff42aff3c884aeec3b62e9"} Dec 16 06:54:23 crc kubenswrapper[4789]: I1216 06:54:23.358510 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7bbbe90f264bd34332704de6e880ce4b8f5f1b34ff42aff3c884aeec3b62e9" Dec 16 06:54:23 crc kubenswrapper[4789]: I1216 06:54:23.358055 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 06:54:23 crc kubenswrapper[4789]: I1216 06:54:23.609127 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lw2sm" Dec 16 06:54:23 crc kubenswrapper[4789]: I1216 06:54:23.609464 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lw2sm" Dec 16 06:54:24 crc kubenswrapper[4789]: I1216 06:54:24.003097 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6lz9" Dec 16 06:54:24 crc kubenswrapper[4789]: I1216 06:54:24.003188 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6lz9" Dec 16 06:54:24 crc kubenswrapper[4789]: I1216 06:54:24.673969 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lw2sm" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="registry-server" probeResult="failure" output=< Dec 16 06:54:24 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 06:54:24 crc kubenswrapper[4789]: > Dec 16 06:54:25 crc kubenswrapper[4789]: I1216 06:54:25.041683 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p6lz9" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="registry-server" probeResult="failure" output=< Dec 16 06:54:25 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 06:54:25 crc kubenswrapper[4789]: > Dec 16 06:54:29 crc kubenswrapper[4789]: I1216 06:54:29.387011 4789 generic.go:334] "Generic (PLEG): container finished" podID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerID="e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6" exitCode=0 Dec 16 06:54:29 crc kubenswrapper[4789]: I1216 06:54:29.387120 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qphpp" event={"ID":"ffdc85ec-5987-47f0-af71-9896f60cb294","Type":"ContainerDied","Data":"e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6"} Dec 16 06:54:29 crc kubenswrapper[4789]: I1216 06:54:29.390302 4789 generic.go:334] "Generic (PLEG): container finished" podID="706aa82a-2280-4715-919c-a480a2a81f8d" containerID="5dad657c25fa22973bb79c7f939a34692d620e7ee61100b5e98b980a968351f0" exitCode=0 Dec 16 06:54:29 crc kubenswrapper[4789]: I1216 06:54:29.390352 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wdfg" event={"ID":"706aa82a-2280-4715-919c-a480a2a81f8d","Type":"ContainerDied","Data":"5dad657c25fa22973bb79c7f939a34692d620e7ee61100b5e98b980a968351f0"} Dec 16 06:54:31 crc kubenswrapper[4789]: I1216 06:54:31.027401 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:54:31 crc kubenswrapper[4789]: I1216 06:54:31.027754 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:54:32 crc kubenswrapper[4789]: I1216 06:54:32.069230 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dmjv4" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="registry-server" probeResult="failure" output=< Dec 16 06:54:32 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 06:54:32 crc kubenswrapper[4789]: > Dec 16 06:54:33 crc kubenswrapper[4789]: I1216 06:54:33.648824 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lw2sm" Dec 16 06:54:33 crc kubenswrapper[4789]: I1216 06:54:33.689269 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lw2sm" Dec 16 
06:54:34 crc kubenswrapper[4789]: I1216 06:54:34.540776 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6lz9" Dec 16 06:54:34 crc kubenswrapper[4789]: I1216 06:54:34.583743 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6lz9" Dec 16 06:54:35 crc kubenswrapper[4789]: I1216 06:54:35.417998 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qphpp" event={"ID":"ffdc85ec-5987-47f0-af71-9896f60cb294","Type":"ContainerStarted","Data":"1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595"} Dec 16 06:54:35 crc kubenswrapper[4789]: I1216 06:54:35.437099 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qphpp" podStartSLOduration=2.737338001 podStartE2EDuration="55.437082029s" podCreationTimestamp="2025-12-16 06:53:40 +0000 UTC" firstStartedPulling="2025-12-16 06:53:41.940878896 +0000 UTC m=+160.202766525" lastFinishedPulling="2025-12-16 06:54:34.640622924 +0000 UTC m=+212.902510553" observedRunningTime="2025-12-16 06:54:35.435279259 +0000 UTC m=+213.697166888" watchObservedRunningTime="2025-12-16 06:54:35.437082029 +0000 UTC m=+213.698969658" Dec 16 06:54:36 crc kubenswrapper[4789]: I1216 06:54:36.276893 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6lz9"] Dec 16 06:54:36 crc kubenswrapper[4789]: I1216 06:54:36.422488 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6lz9" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="registry-server" containerID="cri-o://adeae7b46c054fdc774e37351351ee37858098ebcb31c6092a857fa3a42d6b41" gracePeriod=2 Dec 16 06:54:38 crc kubenswrapper[4789]: I1216 06:54:38.434549 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerID="adeae7b46c054fdc774e37351351ee37858098ebcb31c6092a857fa3a42d6b41" exitCode=0 Dec 16 06:54:38 crc kubenswrapper[4789]: I1216 06:54:38.434588 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6lz9" event={"ID":"27e6916f-d4df-4de1-8781-b7efdc23fff9","Type":"ContainerDied","Data":"adeae7b46c054fdc774e37351351ee37858098ebcb31c6092a857fa3a42d6b41"} Dec 16 06:54:39 crc kubenswrapper[4789]: I1216 06:54:39.441605 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wdfg" event={"ID":"706aa82a-2280-4715-919c-a480a2a81f8d","Type":"ContainerStarted","Data":"000b76426f38d969384bb9190479f5c6282b00345280198570f0dc0333fc4a75"} Dec 16 06:54:39 crc kubenswrapper[4789]: I1216 06:54:39.783259 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6lz9" Dec 16 06:54:39 crc kubenswrapper[4789]: I1216 06:54:39.905060 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-catalog-content\") pod \"27e6916f-d4df-4de1-8781-b7efdc23fff9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " Dec 16 06:54:39 crc kubenswrapper[4789]: I1216 06:54:39.905132 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-utilities\") pod \"27e6916f-d4df-4de1-8781-b7efdc23fff9\" (UID: \"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " Dec 16 06:54:39 crc kubenswrapper[4789]: I1216 06:54:39.905183 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-489xf\" (UniqueName: \"kubernetes.io/projected/27e6916f-d4df-4de1-8781-b7efdc23fff9-kube-api-access-489xf\") pod \"27e6916f-d4df-4de1-8781-b7efdc23fff9\" (UID: 
\"27e6916f-d4df-4de1-8781-b7efdc23fff9\") " Dec 16 06:54:39 crc kubenswrapper[4789]: I1216 06:54:39.906175 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-utilities" (OuterVolumeSpecName: "utilities") pod "27e6916f-d4df-4de1-8781-b7efdc23fff9" (UID: "27e6916f-d4df-4de1-8781-b7efdc23fff9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:54:39 crc kubenswrapper[4789]: I1216 06:54:39.912083 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e6916f-d4df-4de1-8781-b7efdc23fff9-kube-api-access-489xf" (OuterVolumeSpecName: "kube-api-access-489xf") pod "27e6916f-d4df-4de1-8781-b7efdc23fff9" (UID: "27e6916f-d4df-4de1-8781-b7efdc23fff9"). InnerVolumeSpecName "kube-api-access-489xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.006870 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.006935 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-489xf\" (UniqueName: \"kubernetes.io/projected/27e6916f-d4df-4de1-8781-b7efdc23fff9-kube-api-access-489xf\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.448968 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6lz9" event={"ID":"27e6916f-d4df-4de1-8781-b7efdc23fff9","Type":"ContainerDied","Data":"272bc472c64b765c6435ae05a02c66e7caafbe1cd9dee64fc4711815ec9bea28"} Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.449025 4789 scope.go:117] "RemoveContainer" containerID="adeae7b46c054fdc774e37351351ee37858098ebcb31c6092a857fa3a42d6b41" Dec 16 06:54:40 crc 
kubenswrapper[4789]: I1216 06:54:40.449070 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6lz9" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.469245 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5wdfg" podStartSLOduration=5.014476179 podStartE2EDuration="58.469223983s" podCreationTimestamp="2025-12-16 06:53:42 +0000 UTC" firstStartedPulling="2025-12-16 06:53:44.987783565 +0000 UTC m=+163.249671194" lastFinishedPulling="2025-12-16 06:54:38.442531359 +0000 UTC m=+216.704418998" observedRunningTime="2025-12-16 06:54:40.468973027 +0000 UTC m=+218.730860676" watchObservedRunningTime="2025-12-16 06:54:40.469223983 +0000 UTC m=+218.731111612" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.642874 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27e6916f-d4df-4de1-8781-b7efdc23fff9" (UID: "27e6916f-d4df-4de1-8781-b7efdc23fff9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.712029 4789 scope.go:117] "RemoveContainer" containerID="d154c5004ec04a293b925aecd4291d0bfc3df9a06439f9b0d5b141fa4829ce35" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.715109 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e6916f-d4df-4de1-8781-b7efdc23fff9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.787023 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6lz9"] Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.795442 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6lz9"] Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.864604 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.864677 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:54:40 crc kubenswrapper[4789]: I1216 06:54:40.902991 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:54:41 crc kubenswrapper[4789]: I1216 06:54:41.061770 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:54:41 crc kubenswrapper[4789]: I1216 06:54:41.110810 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:54:41 crc kubenswrapper[4789]: I1216 06:54:41.496757 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 
06:54:41 crc kubenswrapper[4789]: I1216 06:54:41.901183 4789 scope.go:117] "RemoveContainer" containerID="a4a2374f1fd1d284bc72262bdcee9eef9879e3f8933b20312ae2bd2cb1afa1c9" Dec 16 06:54:42 crc kubenswrapper[4789]: I1216 06:54:42.114211 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" path="/var/lib/kubelet/pods/27e6916f-d4df-4de1-8781-b7efdc23fff9/volumes" Dec 16 06:54:42 crc kubenswrapper[4789]: I1216 06:54:42.409116 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:54:42 crc kubenswrapper[4789]: I1216 06:54:42.409951 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:54:42 crc kubenswrapper[4789]: I1216 06:54:42.448246 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:54:42 crc kubenswrapper[4789]: I1216 06:54:42.881793 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qphpp"] Dec 16 06:54:43 crc kubenswrapper[4789]: I1216 06:54:43.474870 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qphpp" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="registry-server" containerID="cri-o://1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595" gracePeriod=2 Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.279559 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dmjv4"] Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.280021 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dmjv4" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="registry-server" 
containerID="cri-o://027b52ddea537705bc074a05e8baeb0cd647251eb8791e9c261e7852ac2d0b97" gracePeriod=2 Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.405648 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.481539 4789 generic.go:334] "Generic (PLEG): container finished" podID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerID="1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595" exitCode=0 Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.481620 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qphpp" event={"ID":"ffdc85ec-5987-47f0-af71-9896f60cb294","Type":"ContainerDied","Data":"1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595"} Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.481682 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qphpp" event={"ID":"ffdc85ec-5987-47f0-af71-9896f60cb294","Type":"ContainerDied","Data":"eaf0c637801ac21fd1e5adf7bc07584d25d0bebe37c4b89bf859602f815a20e2"} Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.481636 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qphpp" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.481705 4789 scope.go:117] "RemoveContainer" containerID="1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.485322 4789 generic.go:334] "Generic (PLEG): container finished" podID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerID="fc9e5eae220cd1b0748708b679a97d51a0ad0270c17fd9b77b42139806cdbf26" exitCode=0 Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.485382 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dg8zr" event={"ID":"f2a44056-0b8f-4209-b92d-cfb1110ba626","Type":"ContainerDied","Data":"fc9e5eae220cd1b0748708b679a97d51a0ad0270c17fd9b77b42139806cdbf26"} Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.507633 4789 generic.go:334] "Generic (PLEG): container finished" podID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerID="027b52ddea537705bc074a05e8baeb0cd647251eb8791e9c261e7852ac2d0b97" exitCode=0 Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.507768 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmjv4" event={"ID":"c18aa314-a66a-4cb6-95ec-d605e999b29f","Type":"ContainerDied","Data":"027b52ddea537705bc074a05e8baeb0cd647251eb8791e9c261e7852ac2d0b97"} Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.511301 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btjcd" event={"ID":"8a620056-2e2e-46ae-9a32-c8aea4b297c4","Type":"ContainerStarted","Data":"8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76"} Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.515441 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7k7" 
event={"ID":"9af33cc0-7e86-482a-b3a1-89df07600676","Type":"ContainerStarted","Data":"37b071419a4552b47eb17fd41953d763d9960eea7fd05b9e441944d3490406c4"} Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.580891 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-utilities\") pod \"ffdc85ec-5987-47f0-af71-9896f60cb294\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.580954 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-catalog-content\") pod \"ffdc85ec-5987-47f0-af71-9896f60cb294\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.581058 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkl4\" (UniqueName: \"kubernetes.io/projected/ffdc85ec-5987-47f0-af71-9896f60cb294-kube-api-access-wdkl4\") pod \"ffdc85ec-5987-47f0-af71-9896f60cb294\" (UID: \"ffdc85ec-5987-47f0-af71-9896f60cb294\") " Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.581780 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-utilities" (OuterVolumeSpecName: "utilities") pod "ffdc85ec-5987-47f0-af71-9896f60cb294" (UID: "ffdc85ec-5987-47f0-af71-9896f60cb294"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.589106 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffdc85ec-5987-47f0-af71-9896f60cb294-kube-api-access-wdkl4" (OuterVolumeSpecName: "kube-api-access-wdkl4") pod "ffdc85ec-5987-47f0-af71-9896f60cb294" (UID: "ffdc85ec-5987-47f0-af71-9896f60cb294"). InnerVolumeSpecName "kube-api-access-wdkl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.627597 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffdc85ec-5987-47f0-af71-9896f60cb294" (UID: "ffdc85ec-5987-47f0-af71-9896f60cb294"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.669537 4789 scope.go:117] "RemoveContainer" containerID="e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.682764 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkl4\" (UniqueName: \"kubernetes.io/projected/ffdc85ec-5987-47f0-af71-9896f60cb294-kube-api-access-wdkl4\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.682790 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.682799 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffdc85ec-5987-47f0-af71-9896f60cb294-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 
06:54:44.711718 4789 scope.go:117] "RemoveContainer" containerID="5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.725892 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.731512 4789 scope.go:117] "RemoveContainer" containerID="1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595" Dec 16 06:54:44 crc kubenswrapper[4789]: E1216 06:54:44.731954 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595\": container with ID starting with 1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595 not found: ID does not exist" containerID="1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.731987 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595"} err="failed to get container status \"1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595\": rpc error: code = NotFound desc = could not find container \"1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595\": container with ID starting with 1fb47389fe26d2ebba6b3360686d641e7d6b95cc3c0a2abca26619a8fbc59595 not found: ID does not exist" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.732023 4789 scope.go:117] "RemoveContainer" containerID="e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6" Dec 16 06:54:44 crc kubenswrapper[4789]: E1216 06:54:44.732267 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6\": container with ID starting with e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6 not found: ID does not exist" containerID="e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.732286 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6"} err="failed to get container status \"e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6\": rpc error: code = NotFound desc = could not find container \"e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6\": container with ID starting with e840b65229a6a832a5fbaaac05a87e859a0285d863f3e32fdbaa88f9ec6958d6 not found: ID does not exist" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.732298 4789 scope.go:117] "RemoveContainer" containerID="5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b" Dec 16 06:54:44 crc kubenswrapper[4789]: E1216 06:54:44.732594 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b\": container with ID starting with 5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b not found: ID does not exist" containerID="5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.732612 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b"} err="failed to get container status \"5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b\": rpc error: code = NotFound desc = could not find container \"5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b\": container with ID 
starting with 5bd0ada90b2857b1c6a95f5331d0d0edaec2d003c250d8e1eded843a9778d76b not found: ID does not exist" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.807537 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qphpp"] Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.811001 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qphpp"] Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.884845 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-catalog-content\") pod \"c18aa314-a66a-4cb6-95ec-d605e999b29f\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.884949 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-utilities\") pod \"c18aa314-a66a-4cb6-95ec-d605e999b29f\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.885075 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kklnz\" (UniqueName: \"kubernetes.io/projected/c18aa314-a66a-4cb6-95ec-d605e999b29f-kube-api-access-kklnz\") pod \"c18aa314-a66a-4cb6-95ec-d605e999b29f\" (UID: \"c18aa314-a66a-4cb6-95ec-d605e999b29f\") " Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.885793 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-utilities" (OuterVolumeSpecName: "utilities") pod "c18aa314-a66a-4cb6-95ec-d605e999b29f" (UID: "c18aa314-a66a-4cb6-95ec-d605e999b29f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.888893 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18aa314-a66a-4cb6-95ec-d605e999b29f-kube-api-access-kklnz" (OuterVolumeSpecName: "kube-api-access-kklnz") pod "c18aa314-a66a-4cb6-95ec-d605e999b29f" (UID: "c18aa314-a66a-4cb6-95ec-d605e999b29f"). InnerVolumeSpecName "kube-api-access-kklnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.929799 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c18aa314-a66a-4cb6-95ec-d605e999b29f" (UID: "c18aa314-a66a-4cb6-95ec-d605e999b29f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.986834 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kklnz\" (UniqueName: \"kubernetes.io/projected/c18aa314-a66a-4cb6-95ec-d605e999b29f-kube-api-access-kklnz\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.986884 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:44 crc kubenswrapper[4789]: I1216 06:54:44.986904 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c18aa314-a66a-4cb6-95ec-d605e999b29f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.525684 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dg8zr" 
event={"ID":"f2a44056-0b8f-4209-b92d-cfb1110ba626","Type":"ContainerStarted","Data":"52b2072abc42635ee0d71b30dab8895a621403a1bf47fe95545f9a124c4aba88"} Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.528427 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dmjv4" Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.528484 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dmjv4" event={"ID":"c18aa314-a66a-4cb6-95ec-d605e999b29f","Type":"ContainerDied","Data":"aa30df2d66620c98efe965fa1f73f21334ebd90701fa55796c1958efdc5ae769"} Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.528557 4789 scope.go:117] "RemoveContainer" containerID="027b52ddea537705bc074a05e8baeb0cd647251eb8791e9c261e7852ac2d0b97" Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.530662 4789 generic.go:334] "Generic (PLEG): container finished" podID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerID="8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76" exitCode=0 Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.530710 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btjcd" event={"ID":"8a620056-2e2e-46ae-9a32-c8aea4b297c4","Type":"ContainerDied","Data":"8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76"} Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.532899 4789 generic.go:334] "Generic (PLEG): container finished" podID="9af33cc0-7e86-482a-b3a1-89df07600676" containerID="37b071419a4552b47eb17fd41953d763d9960eea7fd05b9e441944d3490406c4" exitCode=0 Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.532984 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7k7" 
event={"ID":"9af33cc0-7e86-482a-b3a1-89df07600676","Type":"ContainerDied","Data":"37b071419a4552b47eb17fd41953d763d9960eea7fd05b9e441944d3490406c4"} Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.549620 4789 scope.go:117] "RemoveContainer" containerID="fabca29482fe20c154047a05715c0c3c89fdd9766b31627d843a9eeb7b9e67c5" Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.561541 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dg8zr" podStartSLOduration=3.573052332 podStartE2EDuration="1m3.561520071s" podCreationTimestamp="2025-12-16 06:53:42 +0000 UTC" firstStartedPulling="2025-12-16 06:53:44.995529462 +0000 UTC m=+163.257417091" lastFinishedPulling="2025-12-16 06:54:44.983997201 +0000 UTC m=+223.245884830" observedRunningTime="2025-12-16 06:54:45.547544845 +0000 UTC m=+223.809432494" watchObservedRunningTime="2025-12-16 06:54:45.561520071 +0000 UTC m=+223.823407700" Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.562949 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dmjv4"] Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.565843 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dmjv4"] Dec 16 06:54:45 crc kubenswrapper[4789]: I1216 06:54:45.587259 4789 scope.go:117] "RemoveContainer" containerID="c00edfc76110dd75e59e6631c712b3642f561d89a273a76a53259b76ada30a7e" Dec 16 06:54:46 crc kubenswrapper[4789]: I1216 06:54:46.116736 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" path="/var/lib/kubelet/pods/c18aa314-a66a-4cb6-95ec-d605e999b29f/volumes" Dec 16 06:54:46 crc kubenswrapper[4789]: I1216 06:54:46.117434 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" path="/var/lib/kubelet/pods/ffdc85ec-5987-47f0-af71-9896f60cb294/volumes" Dec 16 
06:54:46 crc kubenswrapper[4789]: I1216 06:54:46.544513 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btjcd" event={"ID":"8a620056-2e2e-46ae-9a32-c8aea4b297c4","Type":"ContainerStarted","Data":"d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d"} Dec 16 06:54:46 crc kubenswrapper[4789]: I1216 06:54:46.546061 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7k7" event={"ID":"9af33cc0-7e86-482a-b3a1-89df07600676","Type":"ContainerStarted","Data":"9e6722da7678a0a614b638dfbbb4ce49a52d6e2d4325100a7f6e39ce88f0c3f2"} Dec 16 06:54:46 crc kubenswrapper[4789]: I1216 06:54:46.566191 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-btjcd" podStartSLOduration=2.444933464 podStartE2EDuration="1m6.566176954s" podCreationTimestamp="2025-12-16 06:53:40 +0000 UTC" firstStartedPulling="2025-12-16 06:53:41.949546895 +0000 UTC m=+160.211434524" lastFinishedPulling="2025-12-16 06:54:46.070790385 +0000 UTC m=+224.332678014" observedRunningTime="2025-12-16 06:54:46.565150115 +0000 UTC m=+224.827037744" watchObservedRunningTime="2025-12-16 06:54:46.566176954 +0000 UTC m=+224.828064583" Dec 16 06:54:50 crc kubenswrapper[4789]: I1216 06:54:50.490437 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:54:50 crc kubenswrapper[4789]: I1216 06:54:50.490717 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:54:50 crc kubenswrapper[4789]: I1216 06:54:50.548718 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:54:50 crc kubenswrapper[4789]: I1216 06:54:50.576803 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-5r7k7" podStartSLOduration=6.519246808 podStartE2EDuration="1m10.576782309s" podCreationTimestamp="2025-12-16 06:53:40 +0000 UTC" firstStartedPulling="2025-12-16 06:53:41.939645638 +0000 UTC m=+160.201533267" lastFinishedPulling="2025-12-16 06:54:45.997181099 +0000 UTC m=+224.259068768" observedRunningTime="2025-12-16 06:54:46.580350056 +0000 UTC m=+224.842237685" watchObservedRunningTime="2025-12-16 06:54:50.576782309 +0000 UTC m=+228.838669938" Dec 16 06:54:50 crc kubenswrapper[4789]: I1216 06:54:50.711804 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:54:50 crc kubenswrapper[4789]: I1216 06:54:50.711877 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:54:50 crc kubenswrapper[4789]: I1216 06:54:50.772672 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.609451 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.906480 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"] Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.906798 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" podUID="ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" containerName="controller-manager" containerID="cri-o://608a0929d4464dd2aa54974f9a6bba308991697d1203b9d809abfa7468851dd5" gracePeriod=30 Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.927815 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.927882 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.927957 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.928571 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 06:54:51 crc kubenswrapper[4789]: I1216 06:54:51.928638 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6" gracePeriod=600 Dec 16 06:54:52 crc kubenswrapper[4789]: I1216 06:54:52.002778 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"] Dec 16 06:54:52 crc kubenswrapper[4789]: I1216 06:54:52.003040 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" podUID="67692a1d-1522-4b3c-bd9e-a6151d1ef98c" containerName="route-controller-manager" containerID="cri-o://7456ab684083f1c25507488e0a1c0da6dccfeac630852e327b89636e6e3d8360" gracePeriod=30 Dec 16 06:54:52 crc kubenswrapper[4789]: I1216 06:54:52.444415 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:54:52 crc kubenswrapper[4789]: I1216 06:54:52.826886 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dg8zr" Dec 16 06:54:52 crc kubenswrapper[4789]: I1216 06:54:52.826994 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dg8zr" Dec 16 06:54:52 crc kubenswrapper[4789]: I1216 06:54:52.875975 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dg8zr" Dec 16 06:54:53 crc kubenswrapper[4789]: I1216 06:54:53.583503 4789 generic.go:334] "Generic (PLEG): container finished" podID="67692a1d-1522-4b3c-bd9e-a6151d1ef98c" containerID="7456ab684083f1c25507488e0a1c0da6dccfeac630852e327b89636e6e3d8360" exitCode=0 Dec 16 06:54:53 crc kubenswrapper[4789]: I1216 06:54:53.583593 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" event={"ID":"67692a1d-1522-4b3c-bd9e-a6151d1ef98c","Type":"ContainerDied","Data":"7456ab684083f1c25507488e0a1c0da6dccfeac630852e327b89636e6e3d8360"} Dec 16 06:54:53 crc kubenswrapper[4789]: I1216 06:54:53.586233 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" containerID="608a0929d4464dd2aa54974f9a6bba308991697d1203b9d809abfa7468851dd5" exitCode=0 Dec 16 06:54:53 crc kubenswrapper[4789]: I1216 06:54:53.586327 4789 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" event={"ID":"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c","Type":"ContainerDied","Data":"608a0929d4464dd2aa54974f9a6bba308991697d1203b9d809abfa7468851dd5"} Dec 16 06:54:53 crc kubenswrapper[4789]: I1216 06:54:53.588399 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6" exitCode=0 Dec 16 06:54:53 crc kubenswrapper[4789]: I1216 06:54:53.588438 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6"} Dec 16 06:54:53 crc kubenswrapper[4789]: I1216 06:54:53.621286 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dg8zr" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.214677 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.219635 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242419 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9875cff8f-b6vxd"] Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242617 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="extract-utilities" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242629 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="extract-utilities" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242637 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="extract-content" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242643 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="extract-content" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242651 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="extract-content" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242657 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="extract-content" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242664 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242671 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242678 4789 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="extract-utilities" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242685 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="extract-utilities" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242692 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03da8f96-51e6-4bfb-8d36-2d1cb855cf99" containerName="pruner" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242698 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="03da8f96-51e6-4bfb-8d36-2d1cb855cf99" containerName="pruner" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242705 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="extract-content" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242710 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="extract-content" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242720 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242727 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242735 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242740 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242766 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="extract-utilities" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242773 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="extract-utilities" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242783 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67692a1d-1522-4b3c-bd9e-a6151d1ef98c" containerName="route-controller-manager" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242790 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="67692a1d-1522-4b3c-bd9e-a6151d1ef98c" containerName="route-controller-manager" Dec 16 06:54:54 crc kubenswrapper[4789]: E1216 06:54:54.242800 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" containerName="controller-manager" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242805 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" containerName="controller-manager" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242894 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" containerName="controller-manager" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242921 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c18aa314-a66a-4cb6-95ec-d605e999b29f" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242931 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffdc85ec-5987-47f0-af71-9896f60cb294" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242939 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="03da8f96-51e6-4bfb-8d36-2d1cb855cf99" containerName="pruner" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242948 4789 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="27e6916f-d4df-4de1-8781-b7efdc23fff9" containerName="registry-server" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.242956 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="67692a1d-1522-4b3c-bd9e-a6151d1ef98c" containerName="route-controller-manager" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.243281 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.256668 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9875cff8f-b6vxd"] Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306441 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42mrt\" (UniqueName: \"kubernetes.io/projected/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-kube-api-access-42mrt\") pod \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306511 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-client-ca\") pod \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306546 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-config\") pod \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306566 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-proxy-ca-bundles\") pod \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306587 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-config\") pod \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306602 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv5wm\" (UniqueName: \"kubernetes.io/projected/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-kube-api-access-kv5wm\") pod \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306623 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-serving-cert\") pod \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\" (UID: \"67692a1d-1522-4b3c-bd9e-a6151d1ef98c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306661 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-client-ca\") pod \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.306676 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-serving-cert\") pod \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\" (UID: \"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c\") " Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.307845 4789 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-client-ca" (OuterVolumeSpecName: "client-ca") pod "67692a1d-1522-4b3c-bd9e-a6151d1ef98c" (UID: "67692a1d-1522-4b3c-bd9e-a6151d1ef98c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.307854 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-config" (OuterVolumeSpecName: "config") pod "67692a1d-1522-4b3c-bd9e-a6151d1ef98c" (UID: "67692a1d-1522-4b3c-bd9e-a6151d1ef98c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.308231 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-config" (OuterVolumeSpecName: "config") pod "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" (UID: "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.308512 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" (UID: "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.308735 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-client-ca" (OuterVolumeSpecName: "client-ca") pod "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" (UID: "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.312282 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" (UID: "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.312470 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-kube-api-access-kv5wm" (OuterVolumeSpecName: "kube-api-access-kv5wm") pod "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" (UID: "ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c"). InnerVolumeSpecName "kube-api-access-kv5wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.312566 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-kube-api-access-42mrt" (OuterVolumeSpecName: "kube-api-access-42mrt") pod "67692a1d-1522-4b3c-bd9e-a6151d1ef98c" (UID: "67692a1d-1522-4b3c-bd9e-a6151d1ef98c"). InnerVolumeSpecName "kube-api-access-42mrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.319985 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67692a1d-1522-4b3c-bd9e-a6151d1ef98c" (UID: "67692a1d-1522-4b3c-bd9e-a6151d1ef98c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408079 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-config\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408129 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-client-ca\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408146 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a429877-f45a-4835-b947-f3f97ad199bd-serving-cert\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408178 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbx8s\" (UniqueName: \"kubernetes.io/projected/0a429877-f45a-4835-b947-f3f97ad199bd-kube-api-access-jbx8s\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408226 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-proxy-ca-bundles\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408630 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408662 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408674 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408684 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv5wm\" (UniqueName: \"kubernetes.io/projected/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-kube-api-access-kv5wm\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408695 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408703 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408712 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408720 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42mrt\" (UniqueName: \"kubernetes.io/projected/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-kube-api-access-42mrt\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.408729 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67692a1d-1522-4b3c-bd9e-a6151d1ef98c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.510022 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-proxy-ca-bundles\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.510122 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-config\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.510147 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-client-ca\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.510173 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a429877-f45a-4835-b947-f3f97ad199bd-serving-cert\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.510207 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbx8s\" (UniqueName: \"kubernetes.io/projected/0a429877-f45a-4835-b947-f3f97ad199bd-kube-api-access-jbx8s\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.511500 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-proxy-ca-bundles\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.512104 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-client-ca\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.512941 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-config\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc 
kubenswrapper[4789]: I1216 06:54:54.515568 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a429877-f45a-4835-b947-f3f97ad199bd-serving-cert\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.524785 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbx8s\" (UniqueName: \"kubernetes.io/projected/0a429877-f45a-4835-b947-f3f97ad199bd-kube-api-access-jbx8s\") pod \"controller-manager-9875cff8f-b6vxd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.576393 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.598337 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" event={"ID":"ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c","Type":"ContainerDied","Data":"32ef8b25913818061be06c09711d217be5be34ce777adc18b0ed9d1b983c77e8"} Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.598409 4789 scope.go:117] "RemoveContainer" containerID="608a0929d4464dd2aa54974f9a6bba308991697d1203b9d809abfa7468851dd5" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.598635 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.601487 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"2d12421f385572b5f49ec16ce5dc368fcce4c0b47f4845aad6327275ef658245"} Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.604499 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.605011 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt" event={"ID":"67692a1d-1522-4b3c-bd9e-a6151d1ef98c","Type":"ContainerDied","Data":"d24ae2302f6d5c6049e91e03f2da0ff4e1fae05aa56f89f6dc654e8ff1ba9e88"} Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.617103 4789 scope.go:117] "RemoveContainer" containerID="7456ab684083f1c25507488e0a1c0da6dccfeac630852e327b89636e6e3d8360" Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.644309 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"] Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.660049 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-676b6cc5d8-qvtwg"] Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.673054 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"] Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.684230 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b5f8b848-r8jgt"] Dec 16 06:54:54 crc 
kubenswrapper[4789]: I1216 06:54:54.760642 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-s4s66"] Dec 16 06:54:54 crc kubenswrapper[4789]: I1216 06:54:54.886894 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9875cff8f-b6vxd"] Dec 16 06:54:54 crc kubenswrapper[4789]: W1216 06:54:54.892480 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a429877_f45a_4835_b947_f3f97ad199bd.slice/crio-28b5b172cf0975cc285190012c68de3b74f7871cc7107078fac69b8b18423573 WatchSource:0}: Error finding container 28b5b172cf0975cc285190012c68de3b74f7871cc7107078fac69b8b18423573: Status 404 returned error can't find the container with id 28b5b172cf0975cc285190012c68de3b74f7871cc7107078fac69b8b18423573 Dec 16 06:54:55 crc kubenswrapper[4789]: I1216 06:54:55.276381 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dg8zr"] Dec 16 06:54:55 crc kubenswrapper[4789]: I1216 06:54:55.610852 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" event={"ID":"0a429877-f45a-4835-b947-f3f97ad199bd","Type":"ContainerStarted","Data":"91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d"} Dec 16 06:54:55 crc kubenswrapper[4789]: I1216 06:54:55.610904 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" event={"ID":"0a429877-f45a-4835-b947-f3f97ad199bd","Type":"ContainerStarted","Data":"28b5b172cf0975cc285190012c68de3b74f7871cc7107078fac69b8b18423573"} Dec 16 06:54:55 crc kubenswrapper[4789]: I1216 06:54:55.611934 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:55 crc kubenswrapper[4789]: I1216 
06:54:55.613273 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dg8zr" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="registry-server" containerID="cri-o://52b2072abc42635ee0d71b30dab8895a621403a1bf47fe95545f9a124c4aba88" gracePeriod=2 Dec 16 06:54:55 crc kubenswrapper[4789]: I1216 06:54:55.618643 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:54:55 crc kubenswrapper[4789]: I1216 06:54:55.631850 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" podStartSLOduration=4.631831878 podStartE2EDuration="4.631831878s" podCreationTimestamp="2025-12-16 06:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:54:55.631517259 +0000 UTC m=+233.893404888" watchObservedRunningTime="2025-12-16 06:54:55.631831878 +0000 UTC m=+233.893719507" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.112023 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67692a1d-1522-4b3c-bd9e-a6151d1ef98c" path="/var/lib/kubelet/pods/67692a1d-1522-4b3c-bd9e-a6151d1ef98c/volumes" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.113356 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c" path="/var/lib/kubelet/pods/ca9c7707-0f10-4a53-b2f2-d8cf7c37ab6c/volumes" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.315766 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc"] Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.316433 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.318650 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.318787 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.318803 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.319704 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.320751 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.323354 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc"] Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.324472 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.439844 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-serving-cert\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.440272 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-client-ca\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.440319 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pnb4\" (UniqueName: \"kubernetes.io/projected/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-kube-api-access-9pnb4\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.440434 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-config\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.542155 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-client-ca\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.542204 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pnb4\" (UniqueName: \"kubernetes.io/projected/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-kube-api-access-9pnb4\") pod 
\"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.542258 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-config\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.542317 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-serving-cert\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.543227 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-client-ca\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.543565 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-config\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.549535 4789 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-serving-cert\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.562370 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pnb4\" (UniqueName: \"kubernetes.io/projected/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-kube-api-access-9pnb4\") pod \"route-controller-manager-7487d8d6cb-622sc\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.643313 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:54:56 crc kubenswrapper[4789]: I1216 06:54:56.878108 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc"] Dec 16 06:54:56 crc kubenswrapper[4789]: W1216 06:54:56.885364 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed4c5b6_098f_4f9d_b890_1ba5cb82e4b6.slice/crio-e941479964fd25469e9697a2d0e9d50f4b482c40c1d9c800b60d80a1e8c578ed WatchSource:0}: Error finding container e941479964fd25469e9697a2d0e9d50f4b482c40c1d9c800b60d80a1e8c578ed: Status 404 returned error can't find the container with id e941479964fd25469e9697a2d0e9d50f4b482c40c1d9c800b60d80a1e8c578ed Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.625198 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" 
event={"ID":"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6","Type":"ContainerStarted","Data":"e941479964fd25469e9697a2d0e9d50f4b482c40c1d9c800b60d80a1e8c578ed"} Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.989270 4789 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.990605 4789 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.990943 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2" gracePeriod=15 Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.991193 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.991664 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41" gracePeriod=15 Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.991774 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94" gracePeriod=15 Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.991822 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511" gracePeriod=15 Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.991788 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4" gracePeriod=15 Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.992535 4789 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 06:54:57 crc kubenswrapper[4789]: E1216 06:54:57.992964 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.992990 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:54:57 crc kubenswrapper[4789]: E1216 06:54:57.993010 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993022 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:54:57 crc kubenswrapper[4789]: E1216 06:54:57.993039 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993047 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Dec 16 06:54:57 crc kubenswrapper[4789]: E1216 06:54:57.993057 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993065 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 06:54:57 crc kubenswrapper[4789]: E1216 06:54:57.993076 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993094 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 06:54:57 crc kubenswrapper[4789]: E1216 06:54:57.993110 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993118 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 06:54:57 crc kubenswrapper[4789]: E1216 06:54:57.993134 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993144 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993351 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993369 4789 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993380 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993391 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993408 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 06:54:57 crc kubenswrapper[4789]: I1216 06:54:57.993622 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168049 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168134 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168169 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168187 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168215 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168242 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168266 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.168291 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.269616 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.269679 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.269815 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.269892 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.269956 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.269977 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270050 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270112 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270146 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270215 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270863 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270901 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270948 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270970 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.270966 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.271009 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.632683 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" event={"ID":"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6","Type":"ContainerStarted","Data":"0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f"}
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.632974 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.633659 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.637437 4789 generic.go:334] "Generic (PLEG): container finished" podID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerID="52b2072abc42635ee0d71b30dab8895a621403a1bf47fe95545f9a124c4aba88" exitCode=0
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.637535 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dg8zr" event={"ID":"f2a44056-0b8f-4209-b92d-cfb1110ba626","Type":"ContainerDied","Data":"52b2072abc42635ee0d71b30dab8895a621403a1bf47fe95545f9a124c4aba88"}
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.644598 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.646246 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.647157 4789 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41" exitCode=0
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.647189 4789 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4" exitCode=0
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.647200 4789 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511" exitCode=0
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.647210 4789 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94" exitCode=2
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.647432 4789 scope.go:117] "RemoveContainer" containerID="a42e7cdb015161cf3ded5c20d5afdb040f3745a254c1bb8384ba9ef7b761ffb6"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.659015 4789 generic.go:334] "Generic (PLEG): container finished" podID="e8ad11d7-1675-46d0-9e72-10f68b56a823" containerID="85986a76a7d20ecb3d62797a0545ca75cb75644974cddf74c146bb56e8260ab1" exitCode=0
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.659057 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8ad11d7-1675-46d0-9e72-10f68b56a823","Type":"ContainerDied","Data":"85986a76a7d20ecb3d62797a0545ca75cb75644974cddf74c146bb56e8260ab1"}
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.659586 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.659744 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.879993 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.880836 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.881108 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.881378 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.979497 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-utilities\") pod \"f2a44056-0b8f-4209-b92d-cfb1110ba626\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") "
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.979630 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-catalog-content\") pod \"f2a44056-0b8f-4209-b92d-cfb1110ba626\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") "
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.979677 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdxrx\" (UniqueName: \"kubernetes.io/projected/f2a44056-0b8f-4209-b92d-cfb1110ba626-kube-api-access-fdxrx\") pod \"f2a44056-0b8f-4209-b92d-cfb1110ba626\" (UID: \"f2a44056-0b8f-4209-b92d-cfb1110ba626\") "
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.981537 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-utilities" (OuterVolumeSpecName: "utilities") pod "f2a44056-0b8f-4209-b92d-cfb1110ba626" (UID: "f2a44056-0b8f-4209-b92d-cfb1110ba626"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 06:54:58 crc kubenswrapper[4789]: I1216 06:54:58.985989 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a44056-0b8f-4209-b92d-cfb1110ba626-kube-api-access-fdxrx" (OuterVolumeSpecName: "kube-api-access-fdxrx") pod "f2a44056-0b8f-4209-b92d-cfb1110ba626" (UID: "f2a44056-0b8f-4209-b92d-cfb1110ba626"). InnerVolumeSpecName "kube-api-access-fdxrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.013510 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2a44056-0b8f-4209-b92d-cfb1110ba626" (UID: "f2a44056-0b8f-4209-b92d-cfb1110ba626"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.081706 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.081761 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2a44056-0b8f-4209-b92d-cfb1110ba626-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.081777 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdxrx\" (UniqueName: \"kubernetes.io/projected/f2a44056-0b8f-4209-b92d-cfb1110ba626-kube-api-access-fdxrx\") on node \"crc\" DevicePath \"\""
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.633860 4789 patch_prober.go:28] interesting pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.634029 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 16 06:54:59 crc kubenswrapper[4789]: E1216 06:54:59.634652 4789 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event=<
Dec 16 06:54:59 crc kubenswrapper[4789]: &Event{ObjectMeta:{route-controller-manager-7487d8d6cb-622sc.18819fae3ee07997 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-7487d8d6cb-622sc,UID:7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6,APIVersion:v1,ResourceVersion:29678,FieldPath:spec.containers{route-controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.60:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Dec 16 06:54:59 crc kubenswrapper[4789]: body:
Dec 16 06:54:59 crc kubenswrapper[4789]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 06:54:59.633969559 +0000 UTC m=+237.895857198,LastTimestamp:2025-12-16 06:54:59.633969559 +0000 UTC m=+237.895857198,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Dec 16 06:54:59 crc kubenswrapper[4789]: >
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.665811 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dg8zr"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.665943 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dg8zr" event={"ID":"f2a44056-0b8f-4209-b92d-cfb1110ba626","Type":"ContainerDied","Data":"dc1364c001d4d2c7d6fabfab30957f40abe27ec7e2f16afcca5d55f700b07ab7"}
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.665995 4789 scope.go:117] "RemoveContainer" containerID="52b2072abc42635ee0d71b30dab8895a621403a1bf47fe95545f9a124c4aba88"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.666592 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.666809 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.666993 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.668413 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.681525 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.681756 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.682004 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.685365 4789 scope.go:117] "RemoveContainer" containerID="fc9e5eae220cd1b0748708b679a97d51a0ad0270c17fd9b77b42139806cdbf26"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.707255 4789 scope.go:117] "RemoveContainer" containerID="72e6f82fa8014408b7e6f722af7f3c1036333cc4581c46a769b6a7eff65e09be"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.932407 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.933176 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.933413 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:54:59 crc kubenswrapper[4789]: I1216 06:54:59.933655 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.096190 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-kubelet-dir\") pod \"e8ad11d7-1675-46d0-9e72-10f68b56a823\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") "
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.096252 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-var-lock\") pod \"e8ad11d7-1675-46d0-9e72-10f68b56a823\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") "
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.096305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8ad11d7-1675-46d0-9e72-10f68b56a823-kube-api-access\") pod \"e8ad11d7-1675-46d0-9e72-10f68b56a823\" (UID: \"e8ad11d7-1675-46d0-9e72-10f68b56a823\") "
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.101554 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e8ad11d7-1675-46d0-9e72-10f68b56a823" (UID: "e8ad11d7-1675-46d0-9e72-10f68b56a823"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.109809 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-var-lock" (OuterVolumeSpecName: "var-lock") pod "e8ad11d7-1675-46d0-9e72-10f68b56a823" (UID: "e8ad11d7-1675-46d0-9e72-10f68b56a823"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.109980 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ad11d7-1675-46d0-9e72-10f68b56a823-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e8ad11d7-1675-46d0-9e72-10f68b56a823" (UID: "e8ad11d7-1675-46d0-9e72-10f68b56a823"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.197208 4789 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.197235 4789 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8ad11d7-1675-46d0-9e72-10f68b56a823-var-lock\") on node \"crc\" DevicePath \"\""
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.197246 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8ad11d7-1675-46d0-9e72-10f68b56a823-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.534764 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5r7k7"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.535333 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.535728 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.536031 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.536306 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.670087 4789 patch_prober.go:28] interesting pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.670200 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.679081 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.680010 4789 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2" exitCode=0
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.681404 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e8ad11d7-1675-46d0-9e72-10f68b56a823","Type":"ContainerDied","Data":"93fceea3a4444d010bb488159be0c9e74044aa0699d68e46b4028684afd2c7ca"}
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.681425 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.681438 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93fceea3a4444d010bb488159be0c9e74044aa0699d68e46b4028684afd2c7ca"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.685100 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.685621 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.686140 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.686456 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.912132 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.912859 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.913324 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.913590 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.913931 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.914178 4789 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:00 crc kubenswrapper[4789]: I1216 06:55:00.914476 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused"
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.007771 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.007873 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.007881 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.007899 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.007980 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.008376 4789 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.008406 4789 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.008385 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.110178 4789 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.693960 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.695382 4789 scope.go:117] "RemoveContainer" containerID="befb70dd2506f577fa74e4d3640acc6340f477ffc2c45d666d12acc8e0c0ea41" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.695481 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.708275 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.709046 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.709431 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.710091 4789 scope.go:117] "RemoveContainer" containerID="15b92e18b18c1f526bf293b72439d3b51cb34fadc391880f9a992f8cba4f62d4" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.710505 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.711381 4789 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.723044 4789 scope.go:117] "RemoveContainer" containerID="4a76509739badeddbd3fe1b0468ac137cb3ce5f0b75a073de3648ab52f1d5511" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.736390 4789 scope.go:117] "RemoveContainer" containerID="89c960b2ca0360a3ce515edf82d9f692f1894f4c9e7f0e3144117a2e708bbe94" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.748015 4789 scope.go:117] "RemoveContainer" containerID="bf2e674162b4c77eb39f6b175e021a8cc0301c482bf4caaeab9e533213387cf2" Dec 16 06:55:01 crc kubenswrapper[4789]: I1216 06:55:01.763230 4789 scope.go:117] "RemoveContainer" containerID="898ef1e2db974697b230e559499c07790f4fffaa1860a79f403a56e2f32c2a24" Dec 16 06:55:02 crc kubenswrapper[4789]: I1216 06:55:02.106753 4789 status_manager.go:851] "Failed to get 
status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:02 crc kubenswrapper[4789]: I1216 06:55:02.106984 4789 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:02 crc kubenswrapper[4789]: I1216 06:55:02.107365 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:02 crc kubenswrapper[4789]: I1216 06:55:02.107585 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:02 crc kubenswrapper[4789]: I1216 06:55:02.107772 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:02 crc kubenswrapper[4789]: I1216 06:55:02.111935 4789 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 16 06:55:03 crc kubenswrapper[4789]: E1216 06:55:03.031950 4789 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:55:03 crc kubenswrapper[4789]: I1216 06:55:03.032666 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:55:03 crc kubenswrapper[4789]: W1216 06:55:03.048770 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c7b68a225c8229e85b992117612da4682a9617bd8284fd1b0bd24b7b9970315a WatchSource:0}: Error finding container c7b68a225c8229e85b992117612da4682a9617bd8284fd1b0bd24b7b9970315a: Status 404 returned error can't find the container with id c7b68a225c8229e85b992117612da4682a9617bd8284fd1b0bd24b7b9970315a Dec 16 06:55:03 crc kubenswrapper[4789]: I1216 06:55:03.704543 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c7b68a225c8229e85b992117612da4682a9617bd8284fd1b0bd24b7b9970315a"} Dec 16 06:55:04 crc kubenswrapper[4789]: I1216 06:55:04.712743 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1aada94693d2db0ef2ec52a0dc76a3d57ca7a5c3b5b18da14431aa5db07dbb6e"} Dec 16 06:55:04 crc kubenswrapper[4789]: E1216 06:55:04.713731 4789 
kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:55:04 crc kubenswrapper[4789]: I1216 06:55:04.714315 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:04 crc kubenswrapper[4789]: I1216 06:55:04.714755 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:04 crc kubenswrapper[4789]: I1216 06:55:04.715018 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:04 crc kubenswrapper[4789]: I1216 06:55:04.715205 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:05 crc kubenswrapper[4789]: E1216 06:55:05.719411 4789 kubelet.go:1929] "Failed 
creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:55:06 crc kubenswrapper[4789]: E1216 06:55:06.486190 4789 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event=< Dec 16 06:55:06 crc kubenswrapper[4789]: &Event{ObjectMeta:{route-controller-manager-7487d8d6cb-622sc.18819fae3ee07997 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-7487d8d6cb-622sc,UID:7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6,APIVersion:v1,ResourceVersion:29678,FieldPath:spec.containers{route-controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.60:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Dec 16 06:55:06 crc kubenswrapper[4789]: body: Dec 16 06:55:06 crc kubenswrapper[4789]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 06:54:59.633969559 +0000 UTC m=+237.895857198,LastTimestamp:2025-12-16 06:54:59.633969559 +0000 UTC m=+237.895857198,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Dec 16 06:55:06 crc kubenswrapper[4789]: > Dec 16 06:55:07 crc kubenswrapper[4789]: I1216 06:55:07.644331 4789 patch_prober.go:28] interesting pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:55:07 crc kubenswrapper[4789]: I1216 06:55:07.644409 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:55:07 crc kubenswrapper[4789]: E1216 06:55:07.801997 4789 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:07 crc kubenswrapper[4789]: E1216 06:55:07.803074 4789 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:07 crc kubenswrapper[4789]: E1216 06:55:07.803413 4789 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:07 crc kubenswrapper[4789]: E1216 06:55:07.803677 4789 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:07 crc kubenswrapper[4789]: E1216 06:55:07.803950 4789 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: 
connect: connection refused" Dec 16 06:55:07 crc kubenswrapper[4789]: I1216 06:55:07.803982 4789 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 16 06:55:07 crc kubenswrapper[4789]: E1216 06:55:07.804248 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="200ms" Dec 16 06:55:08 crc kubenswrapper[4789]: E1216 06:55:08.005959 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="400ms" Dec 16 06:55:08 crc kubenswrapper[4789]: E1216 06:55:08.121313 4789 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" volumeName="registry-storage" Dec 16 06:55:08 crc kubenswrapper[4789]: E1216 06:55:08.406642 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="800ms" Dec 16 06:55:09 crc kubenswrapper[4789]: E1216 06:55:09.207759 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="1.6s" Dec 16 06:55:10 crc kubenswrapper[4789]: E1216 06:55:10.809348 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="3.2s" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.107270 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.108076 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.108336 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.108617 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.755985 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.756071 4789 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2" exitCode=1 Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.756115 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2"} Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.756842 4789 scope.go:117] "RemoveContainer" containerID="09b765145e25b0397d2a712358fe15b38f447de9ac3fa312d239a86c83cc29b2" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.757220 4789 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.757751 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: 
connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.758137 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.758594 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:12 crc kubenswrapper[4789]: I1216 06:55:12.759067 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.104348 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.106200 4789 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.106777 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.107352 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.107801 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.108184 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.118134 4789 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.118192 4789 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:13 crc kubenswrapper[4789]: E1216 06:55:13.118849 4789 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.119525 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:13 crc kubenswrapper[4789]: W1216 06:55:13.151638 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-1ce3fad664b2d87e0a8c4a26cd19b1d678d7a7b6384c5510ab4c8770edae8b27 WatchSource:0}: Error finding container 1ce3fad664b2d87e0a8c4a26cd19b1d678d7a7b6384c5510ab4c8770edae8b27: Status 404 returned error can't find the container with id 1ce3fad664b2d87e0a8c4a26cd19b1d678d7a7b6384c5510ab4c8770edae8b27 Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.766562 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.767399 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0f3ce134431fdec99ec0effc3f87beb97f315631bd1ef560a919045fc808087"} Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.768401 4789 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.768801 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: 
connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.769037 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.769251 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.769451 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.769749 4789 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e4e36d33ace999270bf041696c9ed5f6cff8bba952991ca512e7df8db65a7a92" exitCode=0 Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.769827 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e4e36d33ace999270bf041696c9ed5f6cff8bba952991ca512e7df8db65a7a92"} Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.769898 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1ce3fad664b2d87e0a8c4a26cd19b1d678d7a7b6384c5510ab4c8770edae8b27"} Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.770290 4789 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.770310 4789 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.770548 4789 status_manager.go:851] "Failed to get status for pod" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: E1216 06:55:13.770644 4789 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.770826 4789 status_manager.go:851] "Failed to get status for pod" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" pod="openshift-marketplace/certified-operators-5r7k7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-5r7k7\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.771092 4789 status_manager.go:851] "Failed to get status for pod" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-7487d8d6cb-622sc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.771402 4789 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:13 crc kubenswrapper[4789]: I1216 06:55:13.771624 4789 status_manager.go:851] "Failed to get status for pod" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" pod="openshift-marketplace/redhat-marketplace-dg8zr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dg8zr\": dial tcp 38.102.83.46:6443: connect: connection refused" Dec 16 06:55:14 crc kubenswrapper[4789]: E1216 06:55:14.011419 4789 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="6.4s" Dec 16 06:55:14 crc kubenswrapper[4789]: I1216 06:55:14.786424 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"15de1da92c3215ed8a5b9d746cccae6ad9a953b1103ecfa0428a8529a8ff382f"} Dec 16 06:55:14 crc kubenswrapper[4789]: I1216 06:55:14.786750 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c458bb519af021b35da60076423b47abc4d137c6b4a4d10add0e35c69bb75bb6"} Dec 16 06:55:14 crc 
kubenswrapper[4789]: I1216 06:55:14.786761 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"869b6ddd0f98faf069fe7caf29d7e8cea6dc238ef006e1beb7a499664e837271"} Dec 16 06:55:14 crc kubenswrapper[4789]: I1216 06:55:14.786770 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7ab53c95e30ce92613977733011c3f447a0e2c1c88b5f3524b5704923924879f"} Dec 16 06:55:15 crc kubenswrapper[4789]: I1216 06:55:15.513058 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:15 crc kubenswrapper[4789]: I1216 06:55:15.794770 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"751eefdbf3e6b59b5529f67567d72c4d6046ef1c49dd609f5ca3e0512721ca88"} Dec 16 06:55:15 crc kubenswrapper[4789]: I1216 06:55:15.794968 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:15 crc kubenswrapper[4789]: I1216 06:55:15.795287 4789 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:15 crc kubenswrapper[4789]: I1216 06:55:15.795320 4789 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:16 crc kubenswrapper[4789]: I1216 06:55:16.988223 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:16 crc kubenswrapper[4789]: I1216 06:55:16.988361 4789 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 06:55:16 crc kubenswrapper[4789]: I1216 06:55:16.988413 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 06:55:17 crc kubenswrapper[4789]: I1216 06:55:17.645072 4789 patch_prober.go:28] interesting pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:55:17 crc kubenswrapper[4789]: I1216 06:55:17.645125 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:55:18 crc kubenswrapper[4789]: I1216 06:55:18.119958 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:18 crc kubenswrapper[4789]: I1216 06:55:18.120013 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:18 crc 
kubenswrapper[4789]: I1216 06:55:18.126696 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:19 crc kubenswrapper[4789]: I1216 06:55:19.832234 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" podUID="32383d71-3226-46ea-9d69-c3ab1096ec2c" containerName="oauth-openshift" containerID="cri-o://9e8eac2b1e7f2c1ddbb9be410ad8532c433010e87cb5b6050e74cd1fe2eacaa8" gracePeriod=15 Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.806074 4789 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.822893 4789 generic.go:334] "Generic (PLEG): container finished" podID="32383d71-3226-46ea-9d69-c3ab1096ec2c" containerID="9e8eac2b1e7f2c1ddbb9be410ad8532c433010e87cb5b6050e74cd1fe2eacaa8" exitCode=0 Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.822944 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" event={"ID":"32383d71-3226-46ea-9d69-c3ab1096ec2c","Type":"ContainerDied","Data":"9e8eac2b1e7f2c1ddbb9be410ad8532c433010e87cb5b6050e74cd1fe2eacaa8"} Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.822967 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" event={"ID":"32383d71-3226-46ea-9d69-c3ab1096ec2c","Type":"ContainerDied","Data":"f3d6929635ed92e90ef544b574f3b1d74abea5ac59cf6ac483748f3278930807"} Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.822980 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3d6929635ed92e90ef544b574f3b1d74abea5ac59cf6ac483748f3278930807" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.832035 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.959862 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-serving-cert\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.959906 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-trusted-ca-bundle\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.959965 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-policies\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.959997 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-session\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960019 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-error\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 
06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960035 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-ocp-branding-template\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960056 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-provider-selection\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960076 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-idp-0-file-data\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960104 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-login\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960120 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-dir\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960140 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-service-ca\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960187 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-router-certs\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960213 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfml8\" (UniqueName: \"kubernetes.io/projected/32383d71-3226-46ea-9d69-c3ab1096ec2c-kube-api-access-tfml8\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960240 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-cliconfig\") pod \"32383d71-3226-46ea-9d69-c3ab1096ec2c\" (UID: \"32383d71-3226-46ea-9d69-c3ab1096ec2c\") " Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.960816 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.961299 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.961349 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.961665 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.961828 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.965640 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.966070 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.966261 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32383d71-3226-46ea-9d69-c3ab1096ec2c-kube-api-access-tfml8" (OuterVolumeSpecName: "kube-api-access-tfml8") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "kube-api-access-tfml8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.966397 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.966646 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.966828 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.967312 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.967390 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:20 crc kubenswrapper[4789]: I1216 06:55:20.967537 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "32383d71-3226-46ea-9d69-c3ab1096ec2c" (UID: "32383d71-3226-46ea-9d69-c3ab1096ec2c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061608 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061655 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061670 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061685 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061697 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061710 4789 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061722 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061735 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061748 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfml8\" (UniqueName: \"kubernetes.io/projected/32383d71-3226-46ea-9d69-c3ab1096ec2c-kube-api-access-tfml8\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061761 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061773 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061785 4789 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061798 4789 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/32383d71-3226-46ea-9d69-c3ab1096ec2c-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.061809 4789 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/32383d71-3226-46ea-9d69-c3ab1096ec2c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.827133 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-s4s66" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.827791 4789 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.827818 4789 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:21 crc kubenswrapper[4789]: I1216 06:55:21.831638 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:21 crc kubenswrapper[4789]: E1216 06:55:21.952277 4789 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Dec 16 06:55:22 crc kubenswrapper[4789]: I1216 06:55:22.126957 4789 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="971b9126-27b1-480a-869c-921bc1569f74" Dec 16 06:55:22 crc kubenswrapper[4789]: I1216 06:55:22.837581 4789 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:22 crc kubenswrapper[4789]: I1216 06:55:22.837896 4789 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6fad8a2-3742-469d-be15-46a42233af5b" Dec 16 06:55:22 crc kubenswrapper[4789]: I1216 06:55:22.841468 4789 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="971b9126-27b1-480a-869c-921bc1569f74" Dec 16 06:55:26 crc kubenswrapper[4789]: I1216 06:55:26.988446 4789 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 06:55:26 crc kubenswrapper[4789]: I1216 06:55:26.988538 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 06:55:27 crc kubenswrapper[4789]: I1216 06:55:27.644617 4789 patch_prober.go:28] interesting pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:55:27 crc kubenswrapper[4789]: I1216 06:55:27.644689 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:55:28 crc kubenswrapper[4789]: I1216 06:55:28.867439 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7487d8d6cb-622sc_7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6/route-controller-manager/0.log" Dec 16 06:55:28 crc kubenswrapper[4789]: I1216 06:55:28.868676 4789 generic.go:334] "Generic (PLEG): container finished" podID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerID="0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f" exitCode=255 Dec 16 06:55:28 crc kubenswrapper[4789]: I1216 06:55:28.868719 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" event={"ID":"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6","Type":"ContainerDied","Data":"0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f"} Dec 16 06:55:28 crc kubenswrapper[4789]: I1216 06:55:28.869315 4789 scope.go:117] "RemoveContainer" containerID="0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f" Dec 16 06:55:29 crc kubenswrapper[4789]: I1216 06:55:29.874882 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7487d8d6cb-622sc_7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6/route-controller-manager/0.log" Dec 16 06:55:29 crc kubenswrapper[4789]: I1216 06:55:29.875148 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" event={"ID":"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6","Type":"ContainerStarted","Data":"27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350"} Dec 16 06:55:29 crc kubenswrapper[4789]: I1216 06:55:29.875470 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:55:29 crc kubenswrapper[4789]: I1216 06:55:29.916649 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 06:55:30 crc kubenswrapper[4789]: I1216 06:55:30.422503 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 06:55:30 crc kubenswrapper[4789]: I1216 06:55:30.574571 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 06:55:30 crc kubenswrapper[4789]: I1216 06:55:30.875928 4789 patch_prober.go:28] interesting pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:55:30 crc kubenswrapper[4789]: I1216 06:55:30.876002 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:55:30 crc kubenswrapper[4789]: I1216 
06:55:30.911017 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 06:55:30 crc kubenswrapper[4789]: I1216 06:55:30.925929 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 06:55:30 crc kubenswrapper[4789]: I1216 06:55:30.991960 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 06:55:31 crc kubenswrapper[4789]: I1216 06:55:31.334792 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 06:55:31 crc kubenswrapper[4789]: I1216 06:55:31.409073 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 06:55:31 crc kubenswrapper[4789]: I1216 06:55:31.443779 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 06:55:31 crc kubenswrapper[4789]: I1216 06:55:31.715962 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 06:55:31 crc kubenswrapper[4789]: I1216 06:55:31.841019 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 06:55:31 crc kubenswrapper[4789]: I1216 06:55:31.881676 4789 patch_prober.go:28] interesting pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:55:31 crc kubenswrapper[4789]: I1216 06:55:31.881737 4789 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:55:32 crc kubenswrapper[4789]: I1216 06:55:32.017204 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 06:55:32 crc kubenswrapper[4789]: I1216 06:55:32.166150 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:55:32 crc kubenswrapper[4789]: I1216 06:55:32.510530 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 06:55:32 crc kubenswrapper[4789]: I1216 06:55:32.727799 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 06:55:32 crc kubenswrapper[4789]: I1216 06:55:32.848005 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 06:55:32 crc kubenswrapper[4789]: I1216 06:55:32.879587 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.018668 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.028699 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.361768 4789 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.370100 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.432256 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.568084 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.675340 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.737101 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.758047 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.762566 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.827259 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.949326 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 06:55:33 crc kubenswrapper[4789]: I1216 06:55:33.953304 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.013959 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.045171 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.204393 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.359854 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.369779 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.380759 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.615062 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.641752 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.652792 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.673733 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.737864 4789 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 06:55:34 crc kubenswrapper[4789]: I1216 06:55:34.950028 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.035472 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.042971 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.068593 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.081476 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.089712 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.163365 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.282504 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.343922 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.360276 4789 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.478262 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.622286 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.685872 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.744030 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.777456 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.963540 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 06:55:35 crc kubenswrapper[4789]: I1216 06:55:35.978418 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.100481 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.110808 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.159180 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" 
Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.200171 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.223328 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.231083 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.256076 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.284744 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.366775 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.476386 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.511338 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.514282 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.625715 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.632473 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 06:55:36 crc 
kubenswrapper[4789]: I1216 06:55:36.655600 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.688033 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.738054 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.799853 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.850987 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.946956 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.970686 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.991670 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:36 crc kubenswrapper[4789]: I1216 06:55:36.996237 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.002226 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.018094 4789 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.107293 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.166279 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.195085 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.241234 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.241406 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.258219 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.288654 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.303670 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.333687 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.473815 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.644278 4789 patch_prober.go:28] interesting 
pod/route-controller-manager-7487d8d6cb-622sc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.644354 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.704164 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.733836 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.812345 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.841468 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 06:55:37 crc kubenswrapper[4789]: I1216 06:55:37.961855 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.053812 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.058833 4789 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.066650 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.102715 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.128979 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.191988 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.287987 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.346047 4789 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.419559 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.519639 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.526901 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.546412 4789 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.549520 4789 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podStartSLOduration=46.549494943 podStartE2EDuration="46.549494943s" podCreationTimestamp="2025-12-16 06:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:55:20.907305875 +0000 UTC m=+259.169193504" watchObservedRunningTime="2025-12-16 06:55:38.549494943 +0000 UTC m=+276.811382612" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.554412 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-s4s66","openshift-marketplace/redhat-marketplace-dg8zr"] Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.554498 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.562517 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.573956 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.573936283 podStartE2EDuration="18.573936283s" podCreationTimestamp="2025-12-16 06:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:55:38.573251797 +0000 UTC m=+276.835139496" watchObservedRunningTime="2025-12-16 06:55:38.573936283 +0000 UTC m=+276.835823912" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.649267 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.707808 4789 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"audit-1" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.730588 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.758315 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.840967 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.846584 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.939054 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.941156 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.961195 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 06:55:38 crc kubenswrapper[4789]: I1216 06:55:38.979182 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.005837 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.011907 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.054706 4789 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.120296 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.222356 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.257507 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.307014 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.356026 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.448468 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.479020 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.548899 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.573325 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.585860 4789 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.643487 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.685561 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.712491 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.726148 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.761970 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.763741 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.799008 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.936825 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.947865 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 06:55:39 crc kubenswrapper[4789]: I1216 06:55:39.948553 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.016216 4789 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.036346 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.110350 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32383d71-3226-46ea-9d69-c3ab1096ec2c" path="/var/lib/kubelet/pods/32383d71-3226-46ea-9d69-c3ab1096ec2c/volumes" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.111203 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" path="/var/lib/kubelet/pods/f2a44056-0b8f-4209-b92d-cfb1110ba626/volumes" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.144637 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.166740 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.175876 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.186063 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.296429 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.317594 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 
06:55:40.407331 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.444053 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.582844 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.583177 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.727841 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.736500 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 06:55:40 crc kubenswrapper[4789]: I1216 06:55:40.842635 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.036813 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.051250 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.101033 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.179263 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 06:55:41 crc 
kubenswrapper[4789]: I1216 06:55:41.196214 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.208337 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.210373 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.261744 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.268757 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.308767 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.387705 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.449722 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.466868 4789 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.474700 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.578619 4789 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.725175 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.752647 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.853207 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.896148 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 06:55:41 crc kubenswrapper[4789]: I1216 06:55:41.897397 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.002903 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.124818 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.141608 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.236443 4789 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.413116 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 06:55:42 crc 
kubenswrapper[4789]: I1216 06:55:42.425528 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.503624 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.507027 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.611979 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.771260 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.801116 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.877607 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.939149 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 06:55:42 crc kubenswrapper[4789]: I1216 06:55:42.941508 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 06:55:43 crc kubenswrapper[4789]: I1216 06:55:43.141881 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 06:55:43 crc kubenswrapper[4789]: I1216 06:55:43.197095 4789 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 06:55:43 crc kubenswrapper[4789]: I1216 06:55:43.353558 4789 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.354143 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1aada94693d2db0ef2ec52a0dc76a3d57ca7a5c3b5b18da14431aa5db07dbb6e" gracePeriod=5 Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.376493 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.388572 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.425414 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.453498 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.561484 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.585904 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.811700 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 06:55:44 crc 
kubenswrapper[4789]: I1216 06:55:43.899140 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.917144 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:43.960236 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.097241 4789 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.149271 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.247625 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.266838 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.279970 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.301315 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.301340 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.311596 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.502324 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.678244 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 06:55:44 crc kubenswrapper[4789]: I1216 06:55:44.762713 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.104969 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.227232 4789 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.268715 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.321394 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.347634 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5fd4b96696-zfw92"] Dec 16 06:55:45 crc kubenswrapper[4789]: E1216 06:55:45.347858 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="extract-utilities" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.347872 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="extract-utilities" Dec 16 06:55:45 crc kubenswrapper[4789]: E1216 
06:55:45.347886 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32383d71-3226-46ea-9d69-c3ab1096ec2c" containerName="oauth-openshift" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.347895 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="32383d71-3226-46ea-9d69-c3ab1096ec2c" containerName="oauth-openshift" Dec 16 06:55:45 crc kubenswrapper[4789]: E1216 06:55:45.347904 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="extract-content" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.347933 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="extract-content" Dec 16 06:55:45 crc kubenswrapper[4789]: E1216 06:55:45.347951 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" containerName="installer" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.347962 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" containerName="installer" Dec 16 06:55:45 crc kubenswrapper[4789]: E1216 06:55:45.347980 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="registry-server" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.347989 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="registry-server" Dec 16 06:55:45 crc kubenswrapper[4789]: E1216 06:55:45.348001 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.348009 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.348130 4789 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ad11d7-1675-46d0-9e72-10f68b56a823" containerName="installer" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.348150 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="32383d71-3226-46ea-9d69-c3ab1096ec2c" containerName="oauth-openshift" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.348163 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.348178 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a44056-0b8f-4209-b92d-cfb1110ba626" containerName="registry-server" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.348659 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.352488 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.352595 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.352617 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.352774 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.353132 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.353314 4789 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.353477 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.353607 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.354065 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.356816 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.357436 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.358716 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.360340 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.367889 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fd4b96696-zfw92"] Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.368895 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.371029 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 
06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.376711 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476224 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-error\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476364 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-session\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476404 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdnfb\" (UniqueName: \"kubernetes.io/projected/d7130bd5-b2db-4ae3-859a-bb88be560a46-kube-api-access-bdnfb\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476431 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-audit-policies\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc 
kubenswrapper[4789]: I1216 06:55:45.476463 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476502 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-login\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476520 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476551 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476579 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476603 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476625 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476651 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476671 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.476693 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7130bd5-b2db-4ae3-859a-bb88be560a46-audit-dir\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.495282 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.565135 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.577740 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.577813 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " 
pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.577842 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.577872 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.577906 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7130bd5-b2db-4ae3-859a-bb88be560a46-audit-dir\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.577956 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-error\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.577996 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-session\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.578018 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-audit-policies\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.578043 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdnfb\" (UniqueName: \"kubernetes.io/projected/d7130bd5-b2db-4ae3-859a-bb88be560a46-kube-api-access-bdnfb\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.578074 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.578117 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " 
pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.578139 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-login\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.578173 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.578202 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.579857 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.580642 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.581325 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-audit-policies\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.581740 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7130bd5-b2db-4ae3-859a-bb88be560a46-audit-dir\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.582204 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.585255 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " 
pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.585762 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-login\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.585796 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.587431 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-template-error\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.592368 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.592375 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-session\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.593461 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.598613 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdnfb\" (UniqueName: \"kubernetes.io/projected/d7130bd5-b2db-4ae3-859a-bb88be560a46-kube-api-access-bdnfb\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:45 crc kubenswrapper[4789]: I1216 06:55:45.609500 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7130bd5-b2db-4ae3-859a-bb88be560a46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fd4b96696-zfw92\" (UID: \"d7130bd5-b2db-4ae3-859a-bb88be560a46\") " pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:45.644207 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:45.725201 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:45.770144 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:45.882208 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.052267 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.079840 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.297939 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.320803 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.356499 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.561007 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fd4b96696-zfw92"] Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.647686 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.695316 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 
06:55:46.924032 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 06:55:46 crc kubenswrapper[4789]: I1216 06:55:46.966335 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" event={"ID":"d7130bd5-b2db-4ae3-859a-bb88be560a46","Type":"ContainerStarted","Data":"15bde28573908bf4545eff9dd7e478ceb264981279ac90f8d04d651170de053d"} Dec 16 06:55:47 crc kubenswrapper[4789]: I1216 06:55:47.971786 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" event={"ID":"d7130bd5-b2db-4ae3-859a-bb88be560a46","Type":"ContainerStarted","Data":"29949730323adabfdc860aa18e058f9cc48ab0a54be5709d780ff0d009d40e76"} Dec 16 06:55:47 crc kubenswrapper[4789]: I1216 06:55:47.972567 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:47 crc kubenswrapper[4789]: I1216 06:55:47.976459 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" Dec 16 06:55:47 crc kubenswrapper[4789]: I1216 06:55:47.991368 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5fd4b96696-zfw92" podStartSLOduration=53.991349519 podStartE2EDuration="53.991349519s" podCreationTimestamp="2025-12-16 06:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:55:47.989830541 +0000 UTC m=+286.251718170" watchObservedRunningTime="2025-12-16 06:55:47.991349519 +0000 UTC m=+286.253237148" Dec 16 06:55:48 crc kubenswrapper[4789]: I1216 06:55:48.978223 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 06:55:48 crc kubenswrapper[4789]: I1216 06:55:48.978508 4789 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1aada94693d2db0ef2ec52a0dc76a3d57ca7a5c3b5b18da14431aa5db07dbb6e" exitCode=137 Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.061587 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.061664 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.221653 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.221825 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.221997 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222068 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222137 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222195 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222219 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222311 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222330 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222873 4789 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222958 4789 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.222998 4789 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.223024 4789 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.231392 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.324041 4789 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.985578 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.985712 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 06:55:49 crc kubenswrapper[4789]: I1216 06:55:49.985725 4789 scope.go:117] "RemoveContainer" containerID="1aada94693d2db0ef2ec52a0dc76a3d57ca7a5c3b5b18da14431aa5db07dbb6e" Dec 16 06:55:50 crc kubenswrapper[4789]: I1216 06:55:50.114883 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.011231 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc"] Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.012284 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" 
containerID="cri-o://27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350" gracePeriod=30 Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.068107 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9875cff8f-b6vxd"] Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.068314 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" podUID="0a429877-f45a-4835-b947-f3f97ad199bd" containerName="controller-manager" containerID="cri-o://91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d" gracePeriod=30 Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.544670 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7487d8d6cb-622sc_7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6/route-controller-manager/0.log" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.544753 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.594860 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.712884 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-serving-cert\") pod \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.712971 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pnb4\" (UniqueName: \"kubernetes.io/projected/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-kube-api-access-9pnb4\") pod \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713011 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-config\") pod \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713039 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-client-ca\") pod \"0a429877-f45a-4835-b947-f3f97ad199bd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713065 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-config\") pod \"0a429877-f45a-4835-b947-f3f97ad199bd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713096 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/0a429877-f45a-4835-b947-f3f97ad199bd-serving-cert\") pod \"0a429877-f45a-4835-b947-f3f97ad199bd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713127 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-client-ca\") pod \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\" (UID: \"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713168 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbx8s\" (UniqueName: \"kubernetes.io/projected/0a429877-f45a-4835-b947-f3f97ad199bd-kube-api-access-jbx8s\") pod \"0a429877-f45a-4835-b947-f3f97ad199bd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713192 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-proxy-ca-bundles\") pod \"0a429877-f45a-4835-b947-f3f97ad199bd\" (UID: \"0a429877-f45a-4835-b947-f3f97ad199bd\") " Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713929 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a429877-f45a-4835-b947-f3f97ad199bd" (UID: "0a429877-f45a-4835-b947-f3f97ad199bd"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.713960 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-config" (OuterVolumeSpecName: "config") pod "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" (UID: "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.714044 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0a429877-f45a-4835-b947-f3f97ad199bd" (UID: "0a429877-f45a-4835-b947-f3f97ad199bd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.714057 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" (UID: "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.714112 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-config" (OuterVolumeSpecName: "config") pod "0a429877-f45a-4835-b947-f3f97ad199bd" (UID: "0a429877-f45a-4835-b947-f3f97ad199bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.719194 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" (UID: "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.719353 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a429877-f45a-4835-b947-f3f97ad199bd-kube-api-access-jbx8s" (OuterVolumeSpecName: "kube-api-access-jbx8s") pod "0a429877-f45a-4835-b947-f3f97ad199bd" (UID: "0a429877-f45a-4835-b947-f3f97ad199bd"). InnerVolumeSpecName "kube-api-access-jbx8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.719402 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-kube-api-access-9pnb4" (OuterVolumeSpecName: "kube-api-access-9pnb4") pod "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" (UID: "7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6"). InnerVolumeSpecName "kube-api-access-9pnb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.719461 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a429877-f45a-4835-b947-f3f97ad199bd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a429877-f45a-4835-b947-f3f97ad199bd" (UID: "0a429877-f45a-4835-b947-f3f97ad199bd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814630 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814674 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pnb4\" (UniqueName: \"kubernetes.io/projected/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-kube-api-access-9pnb4\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814688 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814698 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814708 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814717 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a429877-f45a-4835-b947-f3f97ad199bd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814726 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814736 4789 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jbx8s\" (UniqueName: \"kubernetes.io/projected/0a429877-f45a-4835-b947-f3f97ad199bd-kube-api-access-jbx8s\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:55 crc kubenswrapper[4789]: I1216 06:55:55.814746 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a429877-f45a-4835-b947-f3f97ad199bd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.020435 4789 generic.go:334] "Generic (PLEG): container finished" podID="0a429877-f45a-4835-b947-f3f97ad199bd" containerID="91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d" exitCode=0 Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.020500 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.020512 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" event={"ID":"0a429877-f45a-4835-b947-f3f97ad199bd","Type":"ContainerDied","Data":"91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d"} Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.020541 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9875cff8f-b6vxd" event={"ID":"0a429877-f45a-4835-b947-f3f97ad199bd","Type":"ContainerDied","Data":"28b5b172cf0975cc285190012c68de3b74f7871cc7107078fac69b8b18423573"} Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.020565 4789 scope.go:117] "RemoveContainer" containerID="91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.023243 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7487d8d6cb-622sc_7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6/route-controller-manager/0.log" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.023296 4789 generic.go:334] "Generic (PLEG): container finished" podID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerID="27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350" exitCode=0 Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.023326 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" event={"ID":"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6","Type":"ContainerDied","Data":"27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350"} Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.023353 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" event={"ID":"7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6","Type":"ContainerDied","Data":"e941479964fd25469e9697a2d0e9d50f4b482c40c1d9c800b60d80a1e8c578ed"} Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.023364 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.041959 4789 scope.go:117] "RemoveContainer" containerID="91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d" Dec 16 06:55:56 crc kubenswrapper[4789]: E1216 06:55:56.042449 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d\": container with ID starting with 91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d not found: ID does not exist" containerID="91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.042488 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d"} err="failed to get container status \"91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d\": rpc error: code = NotFound desc = could not find container \"91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d\": container with ID starting with 91bc4073758be3064bc26c1d42f515b0b00500d4e6f69d53a17e777bfd36366d not found: ID does not exist" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.042509 4789 scope.go:117] "RemoveContainer" containerID="27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.052389 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9875cff8f-b6vxd"] Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.057743 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9875cff8f-b6vxd"] Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.060025 4789 scope.go:117] 
"RemoveContainer" containerID="0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.062954 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc"] Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.068563 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7487d8d6cb-622sc"] Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.080328 4789 scope.go:117] "RemoveContainer" containerID="27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350" Dec 16 06:55:56 crc kubenswrapper[4789]: E1216 06:55:56.080966 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350\": container with ID starting with 27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350 not found: ID does not exist" containerID="27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.081005 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350"} err="failed to get container status \"27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350\": rpc error: code = NotFound desc = could not find container \"27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350\": container with ID starting with 27033078887e7d68fa9dad20adb11c9db444f042a8ae2cdbdc1b5a63350f0350 not found: ID does not exist" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.081032 4789 scope.go:117] "RemoveContainer" containerID="0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f" Dec 16 06:55:56 crc kubenswrapper[4789]: E1216 06:55:56.081725 4789 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f\": container with ID starting with 0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f not found: ID does not exist" containerID="0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.081779 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f"} err="failed to get container status \"0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f\": rpc error: code = NotFound desc = could not find container \"0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f\": container with ID starting with 0a9a51c12358e0caccce063f9b5b3ed63d90a60ccdc4b1e06e51a4195db7c21f not found: ID does not exist" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.114849 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a429877-f45a-4835-b947-f3f97ad199bd" path="/var/lib/kubelet/pods/0a429877-f45a-4835-b947-f3f97ad199bd/volumes" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.115390 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" path="/var/lib/kubelet/pods/7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6/volumes" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.355178 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-qvbnd"] Dec 16 06:55:56 crc kubenswrapper[4789]: E1216 06:55:56.355458 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.355477 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: E1216 06:55:56.355491 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.355502 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: E1216 06:55:56.355527 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a429877-f45a-4835-b947-f3f97ad199bd" containerName="controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.355535 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a429877-f45a-4835-b947-f3f97ad199bd" containerName="controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.355662 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.355692 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a429877-f45a-4835-b947-f3f97ad199bd" containerName="controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.355704 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed4c5b6-098f-4f9d-b890-1ba5cb82e4b6" containerName="route-controller-manager" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.356138 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.358212 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.358277 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.358273 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf"] Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.358967 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.358998 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.358279 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.359555 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.359655 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.363108 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.363244 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 
06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.363291 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.363396 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.363830 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.364422 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.369277 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf"] Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.370887 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.378240 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-qvbnd"] Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.477032 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523307 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-serving-cert\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc 
kubenswrapper[4789]: I1216 06:55:56.523350 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jnnl\" (UniqueName: \"kubernetes.io/projected/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-kube-api-access-2jnnl\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523386 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-client-ca\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523429 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-client-ca\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523457 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-config\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523478 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm95q\" (UniqueName: 
\"kubernetes.io/projected/a1aa924b-2a1c-4386-9a7d-bddedb813166-kube-api-access-rm95q\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523497 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-proxy-ca-bundles\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523515 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1aa924b-2a1c-4386-9a7d-bddedb813166-serving-cert\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.523536 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-config\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624236 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-serving-cert\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " 
pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624288 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jnnl\" (UniqueName: \"kubernetes.io/projected/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-kube-api-access-2jnnl\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624322 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-client-ca\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624367 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-client-ca\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624401 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-config\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624425 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm95q\" (UniqueName: 
\"kubernetes.io/projected/a1aa924b-2a1c-4386-9a7d-bddedb813166-kube-api-access-rm95q\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624449 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-proxy-ca-bundles\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624473 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1aa924b-2a1c-4386-9a7d-bddedb813166-serving-cert\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.624498 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-config\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.625507 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-client-ca\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc 
kubenswrapper[4789]: I1216 06:55:56.625507 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-client-ca\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.625748 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-config\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.625754 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-proxy-ca-bundles\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.625902 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-config\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.629289 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1aa924b-2a1c-4386-9a7d-bddedb813166-serving-cert\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.629367 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-serving-cert\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.647564 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jnnl\" (UniqueName: \"kubernetes.io/projected/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-kube-api-access-2jnnl\") pod \"controller-manager-86d46c7f65-qvbnd\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.653711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/a1aa924b-2a1c-4386-9a7d-bddedb813166-kube-api-access-rm95q\") pod \"route-controller-manager-5b5cd4fdff-pljmf\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.683309 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.692731 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:56 crc kubenswrapper[4789]: I1216 06:55:56.922693 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf"] Dec 16 06:55:57 crc kubenswrapper[4789]: I1216 06:55:57.030230 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" event={"ID":"a1aa924b-2a1c-4386-9a7d-bddedb813166","Type":"ContainerStarted","Data":"ce17800e60537238c150358ece9add7fd0fa966c17f57c9f56e3cd0e15fd3e29"} Dec 16 06:55:57 crc kubenswrapper[4789]: I1216 06:55:57.073240 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-qvbnd"] Dec 16 06:55:57 crc kubenswrapper[4789]: W1216 06:55:57.076905 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ad135d_4e4a_4fd6_abb6_ce686b670b8d.slice/crio-7414f5d0ef39e51658cb1375dec2bca824e862341f45b0acf63a8699cc90185f WatchSource:0}: Error finding container 7414f5d0ef39e51658cb1375dec2bca824e862341f45b0acf63a8699cc90185f: Status 404 returned error can't find the container with id 7414f5d0ef39e51658cb1375dec2bca824e862341f45b0acf63a8699cc90185f Dec 16 06:55:57 crc kubenswrapper[4789]: I1216 06:55:57.161180 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 06:55:58 crc kubenswrapper[4789]: I1216 06:55:58.038537 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" event={"ID":"a1aa924b-2a1c-4386-9a7d-bddedb813166","Type":"ContainerStarted","Data":"9d14072b489833cb1a4c26d0f7e40cae6b433fd5a5a6c44c22a1cc325ab0c627"} Dec 16 06:55:58 crc kubenswrapper[4789]: I1216 06:55:58.039305 
4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" event={"ID":"50ad135d-4e4a-4fd6-abb6-ce686b670b8d","Type":"ContainerStarted","Data":"7414f5d0ef39e51658cb1375dec2bca824e862341f45b0acf63a8699cc90185f"} Dec 16 06:55:59 crc kubenswrapper[4789]: I1216 06:55:59.048895 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" event={"ID":"50ad135d-4e4a-4fd6-abb6-ce686b670b8d","Type":"ContainerStarted","Data":"5308e4d291ca97b5f3a422683ad60353700136d0ad64c39d4ea083a50847f61c"} Dec 16 06:55:59 crc kubenswrapper[4789]: I1216 06:55:59.049236 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:59 crc kubenswrapper[4789]: I1216 06:55:59.049250 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:59 crc kubenswrapper[4789]: I1216 06:55:59.055667 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:55:59 crc kubenswrapper[4789]: I1216 06:55:59.055965 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:55:59 crc kubenswrapper[4789]: I1216 06:55:59.065659 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" podStartSLOduration=4.065637989 podStartE2EDuration="4.065637989s" podCreationTimestamp="2025-12-16 06:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:55:59.062637724 +0000 UTC m=+297.324525363" 
watchObservedRunningTime="2025-12-16 06:55:59.065637989 +0000 UTC m=+297.327525618" Dec 16 06:55:59 crc kubenswrapper[4789]: I1216 06:55:59.098260 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" podStartSLOduration=4.098238264 podStartE2EDuration="4.098238264s" podCreationTimestamp="2025-12-16 06:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:55:59.080836749 +0000 UTC m=+297.342724378" watchObservedRunningTime="2025-12-16 06:55:59.098238264 +0000 UTC m=+297.360125893" Dec 16 06:56:05 crc kubenswrapper[4789]: I1216 06:56:05.673547 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 06:56:07 crc kubenswrapper[4789]: I1216 06:56:07.914336 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 06:56:10 crc kubenswrapper[4789]: I1216 06:56:10.322039 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 06:56:11 crc kubenswrapper[4789]: I1216 06:56:11.109647 4789 generic.go:334] "Generic (PLEG): container finished" podID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerID="44c5f1b7d671e9f5747dc3175b4a2f29c9eabc568087d64d8da5d40c5895913b" exitCode=0 Dec 16 06:56:11 crc kubenswrapper[4789]: I1216 06:56:11.109949 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" event={"ID":"03d51736-0f2b-4c40-b6f1-ee44fa4312f9","Type":"ContainerDied","Data":"44c5f1b7d671e9f5747dc3175b4a2f29c9eabc568087d64d8da5d40c5895913b"} Dec 16 06:56:11 crc kubenswrapper[4789]: I1216 06:56:11.110339 4789 scope.go:117] "RemoveContainer" 
containerID="44c5f1b7d671e9f5747dc3175b4a2f29c9eabc568087d64d8da5d40c5895913b" Dec 16 06:56:11 crc kubenswrapper[4789]: I1216 06:56:11.887791 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-qvbnd"] Dec 16 06:56:11 crc kubenswrapper[4789]: I1216 06:56:11.888038 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" podUID="50ad135d-4e4a-4fd6-abb6-ce686b670b8d" containerName="controller-manager" containerID="cri-o://5308e4d291ca97b5f3a422683ad60353700136d0ad64c39d4ea083a50847f61c" gracePeriod=30 Dec 16 06:56:11 crc kubenswrapper[4789]: I1216 06:56:11.902340 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf"] Dec 16 06:56:11 crc kubenswrapper[4789]: I1216 06:56:11.902584 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" podUID="a1aa924b-2a1c-4386-9a7d-bddedb813166" containerName="route-controller-manager" containerID="cri-o://9d14072b489833cb1a4c26d0f7e40cae6b433fd5a5a6c44c22a1cc325ab0c627" gracePeriod=30 Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.130894 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" event={"ID":"03d51736-0f2b-4c40-b6f1-ee44fa4312f9","Type":"ContainerStarted","Data":"3adc11389a094fb049402fc6d63bd295fd729bf28285b84a7e8e507b240706fd"} Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.131227 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.132495 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.133484 4789 generic.go:334] "Generic (PLEG): container finished" podID="a1aa924b-2a1c-4386-9a7d-bddedb813166" containerID="9d14072b489833cb1a4c26d0f7e40cae6b433fd5a5a6c44c22a1cc325ab0c627" exitCode=0 Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.133510 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" event={"ID":"a1aa924b-2a1c-4386-9a7d-bddedb813166","Type":"ContainerDied","Data":"9d14072b489833cb1a4c26d0f7e40cae6b433fd5a5a6c44c22a1cc325ab0c627"} Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.138848 4789 generic.go:334] "Generic (PLEG): container finished" podID="50ad135d-4e4a-4fd6-abb6-ce686b670b8d" containerID="5308e4d291ca97b5f3a422683ad60353700136d0ad64c39d4ea083a50847f61c" exitCode=0 Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.138889 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" event={"ID":"50ad135d-4e4a-4fd6-abb6-ce686b670b8d","Type":"ContainerDied","Data":"5308e4d291ca97b5f3a422683ad60353700136d0ad64c39d4ea083a50847f61c"} Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.455529 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.543767 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.632206 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-client-ca\") pod \"a1aa924b-2a1c-4386-9a7d-bddedb813166\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.632254 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1aa924b-2a1c-4386-9a7d-bddedb813166-serving-cert\") pod \"a1aa924b-2a1c-4386-9a7d-bddedb813166\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.632321 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-config\") pod \"a1aa924b-2a1c-4386-9a7d-bddedb813166\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.632362 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/a1aa924b-2a1c-4386-9a7d-bddedb813166-kube-api-access-rm95q\") pod \"a1aa924b-2a1c-4386-9a7d-bddedb813166\" (UID: \"a1aa924b-2a1c-4386-9a7d-bddedb813166\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.633192 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-client-ca" (OuterVolumeSpecName: "client-ca") pod "a1aa924b-2a1c-4386-9a7d-bddedb813166" (UID: "a1aa924b-2a1c-4386-9a7d-bddedb813166"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.633328 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-config" (OuterVolumeSpecName: "config") pod "a1aa924b-2a1c-4386-9a7d-bddedb813166" (UID: "a1aa924b-2a1c-4386-9a7d-bddedb813166"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.637477 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1aa924b-2a1c-4386-9a7d-bddedb813166-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a1aa924b-2a1c-4386-9a7d-bddedb813166" (UID: "a1aa924b-2a1c-4386-9a7d-bddedb813166"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.637576 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1aa924b-2a1c-4386-9a7d-bddedb813166-kube-api-access-rm95q" (OuterVolumeSpecName: "kube-api-access-rm95q") pod "a1aa924b-2a1c-4386-9a7d-bddedb813166" (UID: "a1aa924b-2a1c-4386-9a7d-bddedb813166"). InnerVolumeSpecName "kube-api-access-rm95q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733236 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-serving-cert\") pod \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733329 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-proxy-ca-bundles\") pod \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733367 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jnnl\" (UniqueName: \"kubernetes.io/projected/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-kube-api-access-2jnnl\") pod \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733392 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-config\") pod \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733444 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-client-ca\") pod \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\" (UID: \"50ad135d-4e4a-4fd6-abb6-ce686b670b8d\") " Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733646 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733658 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1aa924b-2a1c-4386-9a7d-bddedb813166-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733666 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1aa924b-2a1c-4386-9a7d-bddedb813166-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.733674 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/a1aa924b-2a1c-4386-9a7d-bddedb813166-kube-api-access-rm95q\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.734310 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "50ad135d-4e4a-4fd6-abb6-ce686b670b8d" (UID: "50ad135d-4e4a-4fd6-abb6-ce686b670b8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.734488 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "50ad135d-4e4a-4fd6-abb6-ce686b670b8d" (UID: "50ad135d-4e4a-4fd6-abb6-ce686b670b8d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.734726 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-config" (OuterVolumeSpecName: "config") pod "50ad135d-4e4a-4fd6-abb6-ce686b670b8d" (UID: "50ad135d-4e4a-4fd6-abb6-ce686b670b8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.736379 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "50ad135d-4e4a-4fd6-abb6-ce686b670b8d" (UID: "50ad135d-4e4a-4fd6-abb6-ce686b670b8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.736435 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-kube-api-access-2jnnl" (OuterVolumeSpecName: "kube-api-access-2jnnl") pod "50ad135d-4e4a-4fd6-abb6-ce686b670b8d" (UID: "50ad135d-4e4a-4fd6-abb6-ce686b670b8d"). InnerVolumeSpecName "kube-api-access-2jnnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.834490 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.834807 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.834819 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jnnl\" (UniqueName: \"kubernetes.io/projected/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-kube-api-access-2jnnl\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.834828 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:12 crc kubenswrapper[4789]: I1216 06:56:12.834839 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50ad135d-4e4a-4fd6-abb6-ce686b670b8d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.152900 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" event={"ID":"a1aa924b-2a1c-4386-9a7d-bddedb813166","Type":"ContainerDied","Data":"ce17800e60537238c150358ece9add7fd0fa966c17f57c9f56e3cd0e15fd3e29"} Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.153000 4789 scope.go:117] "RemoveContainer" containerID="9d14072b489833cb1a4c26d0f7e40cae6b433fd5a5a6c44c22a1cc325ab0c627" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.154394 4789 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.154603 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" event={"ID":"50ad135d-4e4a-4fd6-abb6-ce686b670b8d","Type":"ContainerDied","Data":"7414f5d0ef39e51658cb1375dec2bca824e862341f45b0acf63a8699cc90185f"} Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.154768 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d46c7f65-qvbnd" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.174972 4789 scope.go:117] "RemoveContainer" containerID="5308e4d291ca97b5f3a422683ad60353700136d0ad64c39d4ea083a50847f61c" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.178460 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-qvbnd"] Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.181790 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-qvbnd"] Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.189708 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf"] Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.196208 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-pljmf"] Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.373780 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54dff54f66-gtd8p"] Dec 16 06:56:13 crc kubenswrapper[4789]: E1216 06:56:13.374060 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a1aa924b-2a1c-4386-9a7d-bddedb813166" containerName="route-controller-manager" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.374074 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1aa924b-2a1c-4386-9a7d-bddedb813166" containerName="route-controller-manager" Dec 16 06:56:13 crc kubenswrapper[4789]: E1216 06:56:13.374087 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ad135d-4e4a-4fd6-abb6-ce686b670b8d" containerName="controller-manager" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.374093 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ad135d-4e4a-4fd6-abb6-ce686b670b8d" containerName="controller-manager" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.374187 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ad135d-4e4a-4fd6-abb6-ce686b670b8d" containerName="controller-manager" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.374199 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1aa924b-2a1c-4386-9a7d-bddedb813166" containerName="route-controller-manager" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.374577 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.377089 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.377202 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.377557 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.378031 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.378214 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.379189 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.384683 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l"] Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.385286 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.385353 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.387105 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.387443 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.387565 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.387728 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.387835 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.388044 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.389702 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54dff54f66-gtd8p"] Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.395655 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l"] Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542614 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-serving-cert\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: 
\"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542661 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkcg\" (UniqueName: \"kubernetes.io/projected/46bdfecb-0151-4f51-8751-9beee9096606-kube-api-access-qgkcg\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542702 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-config\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542727 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-client-ca\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542750 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-client-ca\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542767 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-config\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542782 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bdfecb-0151-4f51-8751-9beee9096606-serving-cert\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.542867 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl4cf\" (UniqueName: \"kubernetes.io/projected/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-kube-api-access-wl4cf\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.543022 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-proxy-ca-bundles\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.643853 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-config\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: 
\"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.643904 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-client-ca\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.643944 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-client-ca\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.644880 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-client-ca\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.643970 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-config\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.644952 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-client-ca\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.644975 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bdfecb-0151-4f51-8751-9beee9096606-serving-cert\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.645547 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl4cf\" (UniqueName: \"kubernetes.io/projected/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-kube-api-access-wl4cf\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.645584 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-proxy-ca-bundles\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.645604 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-serving-cert\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc 
kubenswrapper[4789]: I1216 06:56:13.645629 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkcg\" (UniqueName: \"kubernetes.io/projected/46bdfecb-0151-4f51-8751-9beee9096606-kube-api-access-qgkcg\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.646342 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-config\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.646430 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-proxy-ca-bundles\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.648998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-serving-cert\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.649522 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bdfecb-0151-4f51-8751-9beee9096606-serving-cert\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: 
\"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.651275 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-config\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.663623 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl4cf\" (UniqueName: \"kubernetes.io/projected/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-kube-api-access-wl4cf\") pod \"route-controller-manager-54c7d5f475-7qj4l\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.664072 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkcg\" (UniqueName: \"kubernetes.io/projected/46bdfecb-0151-4f51-8751-9beee9096606-kube-api-access-qgkcg\") pod \"controller-manager-54dff54f66-gtd8p\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.701492 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:13 crc kubenswrapper[4789]: I1216 06:56:13.725878 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:14 crc kubenswrapper[4789]: I1216 06:56:14.114319 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ad135d-4e4a-4fd6-abb6-ce686b670b8d" path="/var/lib/kubelet/pods/50ad135d-4e4a-4fd6-abb6-ce686b670b8d/volumes" Dec 16 06:56:14 crc kubenswrapper[4789]: I1216 06:56:14.115200 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1aa924b-2a1c-4386-9a7d-bddedb813166" path="/var/lib/kubelet/pods/a1aa924b-2a1c-4386-9a7d-bddedb813166/volumes" Dec 16 06:56:14 crc kubenswrapper[4789]: I1216 06:56:14.115808 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54dff54f66-gtd8p"] Dec 16 06:56:14 crc kubenswrapper[4789]: I1216 06:56:14.148057 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l"] Dec 16 06:56:14 crc kubenswrapper[4789]: W1216 06:56:14.164063 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd99f6e_8024_4192_8955_c6dd37cb7ae7.slice/crio-d7a3d00b34fcf54ba1c574ce75b4a40556ced6aac1852245e5c4d25f71f1fa89 WatchSource:0}: Error finding container d7a3d00b34fcf54ba1c574ce75b4a40556ced6aac1852245e5c4d25f71f1fa89: Status 404 returned error can't find the container with id d7a3d00b34fcf54ba1c574ce75b4a40556ced6aac1852245e5c4d25f71f1fa89 Dec 16 06:56:14 crc kubenswrapper[4789]: I1216 06:56:14.169352 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" event={"ID":"46bdfecb-0151-4f51-8751-9beee9096606","Type":"ContainerStarted","Data":"c9753ce09b1cfd0651330430f02173b1d8b073d195d7425574ee737588c49dca"} Dec 16 06:56:14 crc kubenswrapper[4789]: I1216 06:56:14.347122 4789 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.190349 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" event={"ID":"46bdfecb-0151-4f51-8751-9beee9096606","Type":"ContainerStarted","Data":"ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15"} Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.191420 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.192870 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" event={"ID":"fbd99f6e-8024-4192-8955-c6dd37cb7ae7","Type":"ContainerStarted","Data":"516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704"} Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.192900 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" event={"ID":"fbd99f6e-8024-4192-8955-c6dd37cb7ae7","Type":"ContainerStarted","Data":"d7a3d00b34fcf54ba1c574ce75b4a40556ced6aac1852245e5c4d25f71f1fa89"} Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.193229 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.194982 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.199703 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 
06:56:15.212850 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" podStartSLOduration=4.21282963 podStartE2EDuration="4.21282963s" podCreationTimestamp="2025-12-16 06:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:15.206668616 +0000 UTC m=+313.468556235" watchObservedRunningTime="2025-12-16 06:56:15.21282963 +0000 UTC m=+313.474717259" Dec 16 06:56:15 crc kubenswrapper[4789]: I1216 06:56:15.226306 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" podStartSLOduration=4.226283756 podStartE2EDuration="4.226283756s" podCreationTimestamp="2025-12-16 06:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:15.226101032 +0000 UTC m=+313.487988681" watchObservedRunningTime="2025-12-16 06:56:15.226283756 +0000 UTC m=+313.488171405" Dec 16 06:56:16 crc kubenswrapper[4789]: I1216 06:56:16.153023 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 06:56:51 crc kubenswrapper[4789]: I1216 06:56:51.907418 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l"] Dec 16 06:56:51 crc kubenswrapper[4789]: I1216 06:56:51.908276 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" podUID="fbd99f6e-8024-4192-8955-c6dd37cb7ae7" containerName="route-controller-manager" containerID="cri-o://516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704" gracePeriod=30 Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.774465 
4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.910548 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-serving-cert\") pod \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.911246 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl4cf\" (UniqueName: \"kubernetes.io/projected/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-kube-api-access-wl4cf\") pod \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.911438 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-client-ca\") pod \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.911475 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-config\") pod \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\" (UID: \"fbd99f6e-8024-4192-8955-c6dd37cb7ae7\") " Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.912233 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-client-ca" (OuterVolumeSpecName: "client-ca") pod "fbd99f6e-8024-4192-8955-c6dd37cb7ae7" (UID: "fbd99f6e-8024-4192-8955-c6dd37cb7ae7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.912238 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-config" (OuterVolumeSpecName: "config") pod "fbd99f6e-8024-4192-8955-c6dd37cb7ae7" (UID: "fbd99f6e-8024-4192-8955-c6dd37cb7ae7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.915533 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fbd99f6e-8024-4192-8955-c6dd37cb7ae7" (UID: "fbd99f6e-8024-4192-8955-c6dd37cb7ae7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:56:52 crc kubenswrapper[4789]: I1216 06:56:52.920716 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-kube-api-access-wl4cf" (OuterVolumeSpecName: "kube-api-access-wl4cf") pod "fbd99f6e-8024-4192-8955-c6dd37cb7ae7" (UID: "fbd99f6e-8024-4192-8955-c6dd37cb7ae7"). InnerVolumeSpecName "kube-api-access-wl4cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.014238 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.014305 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.014319 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl4cf\" (UniqueName: \"kubernetes.io/projected/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-kube-api-access-wl4cf\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.014331 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbd99f6e-8024-4192-8955-c6dd37cb7ae7-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.377334 4789 generic.go:334] "Generic (PLEG): container finished" podID="fbd99f6e-8024-4192-8955-c6dd37cb7ae7" containerID="516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704" exitCode=0 Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.377385 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" event={"ID":"fbd99f6e-8024-4192-8955-c6dd37cb7ae7","Type":"ContainerDied","Data":"516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704"} Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.377652 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" 
event={"ID":"fbd99f6e-8024-4192-8955-c6dd37cb7ae7","Type":"ContainerDied","Data":"d7a3d00b34fcf54ba1c574ce75b4a40556ced6aac1852245e5c4d25f71f1fa89"} Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.377721 4789 scope.go:117] "RemoveContainer" containerID="516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.377406 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.397928 4789 scope.go:117] "RemoveContainer" containerID="516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704" Dec 16 06:56:53 crc kubenswrapper[4789]: E1216 06:56:53.398351 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704\": container with ID starting with 516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704 not found: ID does not exist" containerID="516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.398385 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704"} err="failed to get container status \"516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704\": rpc error: code = NotFound desc = could not find container \"516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704\": container with ID starting with 516681ee7245de09d622334d52c1c93c71d3e58d8fe18c35b1cfea82f0ee7704 not found: ID does not exist" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.398927 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr"] Dec 16 
06:56:53 crc kubenswrapper[4789]: E1216 06:56:53.399194 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbd99f6e-8024-4192-8955-c6dd37cb7ae7" containerName="route-controller-manager" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.399214 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd99f6e-8024-4192-8955-c6dd37cb7ae7" containerName="route-controller-manager" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.399338 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbd99f6e-8024-4192-8955-c6dd37cb7ae7" containerName="route-controller-manager" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.400115 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.403780 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.404089 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.404386 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.404424 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.404517 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.404556 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 06:56:53 
crc kubenswrapper[4789]: I1216 06:56:53.406295 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l"] Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.412866 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54c7d5f475-7qj4l"] Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.416444 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr"] Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.520086 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/918f18c5-7b54-4db1-93f1-e3416b21d1ee-client-ca\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.520130 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfct4\" (UniqueName: \"kubernetes.io/projected/918f18c5-7b54-4db1-93f1-e3416b21d1ee-kube-api-access-mfct4\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.520162 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918f18c5-7b54-4db1-93f1-e3416b21d1ee-config\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: 
I1216 06:56:53.520234 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918f18c5-7b54-4db1-93f1-e3416b21d1ee-serving-cert\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.621567 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918f18c5-7b54-4db1-93f1-e3416b21d1ee-serving-cert\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.621668 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfct4\" (UniqueName: \"kubernetes.io/projected/918f18c5-7b54-4db1-93f1-e3416b21d1ee-kube-api-access-mfct4\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.621692 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/918f18c5-7b54-4db1-93f1-e3416b21d1ee-client-ca\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.621729 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918f18c5-7b54-4db1-93f1-e3416b21d1ee-config\") pod 
\"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.622722 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/918f18c5-7b54-4db1-93f1-e3416b21d1ee-client-ca\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.623062 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918f18c5-7b54-4db1-93f1-e3416b21d1ee-config\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.627429 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918f18c5-7b54-4db1-93f1-e3416b21d1ee-serving-cert\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.641137 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfct4\" (UniqueName: \"kubernetes.io/projected/918f18c5-7b54-4db1-93f1-e3416b21d1ee-kube-api-access-mfct4\") pod \"route-controller-manager-5b5cd4fdff-mvgfr\" (UID: \"918f18c5-7b54-4db1-93f1-e3416b21d1ee\") " pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:53 crc kubenswrapper[4789]: I1216 06:56:53.721245 4789 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:54 crc kubenswrapper[4789]: I1216 06:56:54.112546 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd99f6e-8024-4192-8955-c6dd37cb7ae7" path="/var/lib/kubelet/pods/fbd99f6e-8024-4192-8955-c6dd37cb7ae7/volumes" Dec 16 06:56:54 crc kubenswrapper[4789]: I1216 06:56:54.160612 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr"] Dec 16 06:56:54 crc kubenswrapper[4789]: I1216 06:56:54.384105 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" event={"ID":"918f18c5-7b54-4db1-93f1-e3416b21d1ee","Type":"ContainerStarted","Data":"d5e18634357cc514e8fdb834d15c9f863b2bc16c49cf7db3cbbf4e8fc7e6e1fa"} Dec 16 06:56:55 crc kubenswrapper[4789]: I1216 06:56:55.390662 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" event={"ID":"918f18c5-7b54-4db1-93f1-e3416b21d1ee","Type":"ContainerStarted","Data":"0a8df96b359e5999a3ef9b15d8ef83c1b8b983026814414ddc22575b1340f7c2"} Dec 16 06:56:55 crc kubenswrapper[4789]: I1216 06:56:55.392148 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:55 crc kubenswrapper[4789]: I1216 06:56:55.397742 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" Dec 16 06:56:55 crc kubenswrapper[4789]: I1216 06:56:55.413128 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b5cd4fdff-mvgfr" podStartSLOduration=4.413114727 podStartE2EDuration="4.413114727s" 
podCreationTimestamp="2025-12-16 06:56:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:56:55.412154594 +0000 UTC m=+353.674042223" watchObservedRunningTime="2025-12-16 06:56:55.413114727 +0000 UTC m=+353.675002356" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.803166 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4pcc"] Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.804278 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.821300 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4pcc"] Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.970502 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa990bad-2336-4c51-8688-7f393a161493-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.970858 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-bound-sa-token\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.971032 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-registry-tls\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.971162 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfft\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-kube-api-access-mbfft\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.971300 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa990bad-2336-4c51-8688-7f393a161493-trusted-ca\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.971345 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.971377 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa990bad-2336-4c51-8688-7f393a161493-registry-certificates\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 
06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.971540 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa990bad-2336-4c51-8688-7f393a161493-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:05 crc kubenswrapper[4789]: I1216 06:57:05.998652 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.072171 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa990bad-2336-4c51-8688-7f393a161493-trusted-ca\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.072239 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa990bad-2336-4c51-8688-7f393a161493-registry-certificates\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.072288 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa990bad-2336-4c51-8688-7f393a161493-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.072314 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa990bad-2336-4c51-8688-7f393a161493-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.072330 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-bound-sa-token\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.072349 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-registry-tls\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.072366 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbfft\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-kube-api-access-mbfft\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.073204 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/fa990bad-2336-4c51-8688-7f393a161493-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.073696 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa990bad-2336-4c51-8688-7f393a161493-registry-certificates\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.074276 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa990bad-2336-4c51-8688-7f393a161493-trusted-ca\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.078469 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-registry-tls\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.078510 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa990bad-2336-4c51-8688-7f393a161493-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.089404 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-bound-sa-token\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.090351 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbfft\" (UniqueName: \"kubernetes.io/projected/fa990bad-2336-4c51-8688-7f393a161493-kube-api-access-mbfft\") pod \"image-registry-66df7c8f76-w4pcc\" (UID: \"fa990bad-2336-4c51-8688-7f393a161493\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.121358 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:06 crc kubenswrapper[4789]: I1216 06:57:06.505605 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4pcc"] Dec 16 06:57:06 crc kubenswrapper[4789]: W1216 06:57:06.508807 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa990bad_2336_4c51_8688_7f393a161493.slice/crio-2b158193cd1b9ae5b4a0b6f2f72ae27f79ab076f481eed5ae580ad6f6dea4f72 WatchSource:0}: Error finding container 2b158193cd1b9ae5b4a0b6f2f72ae27f79ab076f481eed5ae580ad6f6dea4f72: Status 404 returned error can't find the container with id 2b158193cd1b9ae5b4a0b6f2f72ae27f79ab076f481eed5ae580ad6f6dea4f72 Dec 16 06:57:07 crc kubenswrapper[4789]: I1216 06:57:07.458381 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" event={"ID":"fa990bad-2336-4c51-8688-7f393a161493","Type":"ContainerStarted","Data":"cb7fac3151d612846ef3cb61b0b53d865f065210bce17a154ba6aa2d3b4a1be8"} Dec 16 06:57:07 crc kubenswrapper[4789]: I1216 
06:57:07.458710 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" event={"ID":"fa990bad-2336-4c51-8688-7f393a161493","Type":"ContainerStarted","Data":"2b158193cd1b9ae5b4a0b6f2f72ae27f79ab076f481eed5ae580ad6f6dea4f72"} Dec 16 06:57:07 crc kubenswrapper[4789]: I1216 06:57:07.458730 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:11 crc kubenswrapper[4789]: I1216 06:57:11.900465 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" podStartSLOduration=6.900440803 podStartE2EDuration="6.900440803s" podCreationTimestamp="2025-12-16 06:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:07.480341915 +0000 UTC m=+365.742229554" watchObservedRunningTime="2025-12-16 06:57:11.900440803 +0000 UTC m=+370.162328442" Dec 16 06:57:11 crc kubenswrapper[4789]: I1216 06:57:11.902721 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54dff54f66-gtd8p"] Dec 16 06:57:11 crc kubenswrapper[4789]: I1216 06:57:11.902980 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" podUID="46bdfecb-0151-4f51-8751-9beee9096606" containerName="controller-manager" containerID="cri-o://ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15" gracePeriod=30 Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.383498 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.431549 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-bzz2s"] Dec 16 06:57:13 crc kubenswrapper[4789]: E1216 06:57:13.431819 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bdfecb-0151-4f51-8751-9beee9096606" containerName="controller-manager" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.431852 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bdfecb-0151-4f51-8751-9beee9096606" containerName="controller-manager" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.432009 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bdfecb-0151-4f51-8751-9beee9096606" containerName="controller-manager" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.432453 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.456588 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-bzz2s"] Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.470339 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-config\") pod \"46bdfecb-0151-4f51-8751-9beee9096606\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.470386 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkcg\" (UniqueName: \"kubernetes.io/projected/46bdfecb-0151-4f51-8751-9beee9096606-kube-api-access-qgkcg\") pod \"46bdfecb-0151-4f51-8751-9beee9096606\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.470408 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bdfecb-0151-4f51-8751-9beee9096606-serving-cert\") pod \"46bdfecb-0151-4f51-8751-9beee9096606\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.470423 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-proxy-ca-bundles\") pod \"46bdfecb-0151-4f51-8751-9beee9096606\" (UID: \"46bdfecb-0151-4f51-8751-9beee9096606\") " Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.470452 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-client-ca\") pod \"46bdfecb-0151-4f51-8751-9beee9096606\" (UID: 
\"46bdfecb-0151-4f51-8751-9beee9096606\") " Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.471433 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-client-ca" (OuterVolumeSpecName: "client-ca") pod "46bdfecb-0151-4f51-8751-9beee9096606" (UID: "46bdfecb-0151-4f51-8751-9beee9096606"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.471489 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-config" (OuterVolumeSpecName: "config") pod "46bdfecb-0151-4f51-8751-9beee9096606" (UID: "46bdfecb-0151-4f51-8751-9beee9096606"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.471894 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "46bdfecb-0151-4f51-8751-9beee9096606" (UID: "46bdfecb-0151-4f51-8751-9beee9096606"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.485290 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bdfecb-0151-4f51-8751-9beee9096606-kube-api-access-qgkcg" (OuterVolumeSpecName: "kube-api-access-qgkcg") pod "46bdfecb-0151-4f51-8751-9beee9096606" (UID: "46bdfecb-0151-4f51-8751-9beee9096606"). InnerVolumeSpecName "kube-api-access-qgkcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.495378 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bdfecb-0151-4f51-8751-9beee9096606-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "46bdfecb-0151-4f51-8751-9beee9096606" (UID: "46bdfecb-0151-4f51-8751-9beee9096606"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.499747 4789 generic.go:334] "Generic (PLEG): container finished" podID="46bdfecb-0151-4f51-8751-9beee9096606" containerID="ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15" exitCode=0 Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.499975 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" event={"ID":"46bdfecb-0151-4f51-8751-9beee9096606","Type":"ContainerDied","Data":"ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15"} Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.500072 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" event={"ID":"46bdfecb-0151-4f51-8751-9beee9096606","Type":"ContainerDied","Data":"c9753ce09b1cfd0651330430f02173b1d8b073d195d7425574ee737588c49dca"} Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.500151 4789 scope.go:117] "RemoveContainer" containerID="ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.500371 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54dff54f66-gtd8p" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.529277 4789 scope.go:117] "RemoveContainer" containerID="ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15" Dec 16 06:57:13 crc kubenswrapper[4789]: E1216 06:57:13.529705 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15\": container with ID starting with ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15 not found: ID does not exist" containerID="ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.529728 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15"} err="failed to get container status \"ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15\": rpc error: code = NotFound desc = could not find container \"ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15\": container with ID starting with ae9acb471aaa87e7bf4f878cda852711a20e89e73d7958840a9a510952539e15 not found: ID does not exist" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.541540 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54dff54f66-gtd8p"] Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.545097 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54dff54f66-gtd8p"] Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.571887 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-client-ca\") pod 
\"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572002 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-config\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572134 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fztjk\" (UniqueName: \"kubernetes.io/projected/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-kube-api-access-fztjk\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572209 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-proxy-ca-bundles\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572280 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-serving-cert\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572371 4789 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-config\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572395 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkcg\" (UniqueName: \"kubernetes.io/projected/46bdfecb-0151-4f51-8751-9beee9096606-kube-api-access-qgkcg\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572407 4789 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46bdfecb-0151-4f51-8751-9beee9096606-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572421 4789 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.572433 4789 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46bdfecb-0151-4f51-8751-9beee9096606-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.673305 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fztjk\" (UniqueName: \"kubernetes.io/projected/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-kube-api-access-fztjk\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.673363 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-proxy-ca-bundles\") pod \"controller-manager-86d46c7f65-bzz2s\" 
(UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.673406 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-serving-cert\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.673467 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-client-ca\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.673492 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-config\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.674706 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-client-ca\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.674903 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-config\") pod 
\"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.675076 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-proxy-ca-bundles\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.677618 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-serving-cert\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.703865 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fztjk\" (UniqueName: \"kubernetes.io/projected/ed2bb9cb-cf65-449b-81fe-3209d8b689c0-kube-api-access-fztjk\") pod \"controller-manager-86d46c7f65-bzz2s\" (UID: \"ed2bb9cb-cf65-449b-81fe-3209d8b689c0\") " pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.757731 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:13 crc kubenswrapper[4789]: I1216 06:57:13.942856 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d46c7f65-bzz2s"] Dec 16 06:57:14 crc kubenswrapper[4789]: I1216 06:57:14.113124 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46bdfecb-0151-4f51-8751-9beee9096606" path="/var/lib/kubelet/pods/46bdfecb-0151-4f51-8751-9beee9096606/volumes" Dec 16 06:57:14 crc kubenswrapper[4789]: I1216 06:57:14.506988 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" event={"ID":"ed2bb9cb-cf65-449b-81fe-3209d8b689c0","Type":"ContainerStarted","Data":"7ebe60c12d69fad20c0cbe76266f566200485ca89bcf8cd1aab7980f673d94ab"} Dec 16 06:57:16 crc kubenswrapper[4789]: I1216 06:57:16.517548 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" event={"ID":"ed2bb9cb-cf65-449b-81fe-3209d8b689c0","Type":"ContainerStarted","Data":"70ecf1ce4e9140827aceb7294bba29ecd935b593a3dee0b8c439b2b3a1c8ee74"} Dec 16 06:57:16 crc kubenswrapper[4789]: I1216 06:57:16.517900 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:16 crc kubenswrapper[4789]: I1216 06:57:16.521863 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" Dec 16 06:57:16 crc kubenswrapper[4789]: I1216 06:57:16.545844 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86d46c7f65-bzz2s" podStartSLOduration=5.545825436 podStartE2EDuration="5.545825436s" podCreationTimestamp="2025-12-16 06:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:16.540908606 +0000 UTC m=+374.802796235" watchObservedRunningTime="2025-12-16 06:57:16.545825436 +0000 UTC m=+374.807713065" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.360507 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5r7k7"] Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.361247 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5r7k7" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="registry-server" containerID="cri-o://9e6722da7678a0a614b638dfbbb4ce49a52d6e2d4325100a7f6e39ce88f0c3f2" gracePeriod=30 Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.367483 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btjcd"] Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.367755 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-btjcd" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="registry-server" containerID="cri-o://d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d" gracePeriod=30 Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.373477 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmw44"] Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.374057 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" containerID="cri-o://3adc11389a094fb049402fc6d63bd295fd729bf28285b84a7e8e507b240706fd" gracePeriod=30 Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.390888 4789 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-5wdfg"] Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.391173 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5wdfg" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="registry-server" containerID="cri-o://000b76426f38d969384bb9190479f5c6282b00345280198570f0dc0333fc4a75" gracePeriod=30 Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.394851 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xktm2"] Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.395539 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.404230 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xktm2"] Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.407664 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw2sm"] Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.407956 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lw2sm" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="registry-server" containerID="cri-o://dc68cf5edf6a22837f52641014946cbc974bddf7d3d7cffed5287128ea44d580" gracePeriod=30 Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.555374 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26b72fe5-4aad-4c74-917c-9333d34ea481-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.555452 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8qb\" (UniqueName: \"kubernetes.io/projected/26b72fe5-4aad-4c74-917c-9333d34ea481-kube-api-access-hv8qb\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.555493 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26b72fe5-4aad-4c74-917c-9333d34ea481-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.657200 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26b72fe5-4aad-4c74-917c-9333d34ea481-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.657236 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8qb\" (UniqueName: \"kubernetes.io/projected/26b72fe5-4aad-4c74-917c-9333d34ea481-kube-api-access-hv8qb\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.657293 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26b72fe5-4aad-4c74-917c-9333d34ea481-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.659699 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26b72fe5-4aad-4c74-917c-9333d34ea481-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.664307 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/26b72fe5-4aad-4c74-917c-9333d34ea481-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.674873 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8qb\" (UniqueName: \"kubernetes.io/projected/26b72fe5-4aad-4c74-917c-9333d34ea481-kube-api-access-hv8qb\") pod \"marketplace-operator-79b997595-xktm2\" (UID: \"26b72fe5-4aad-4c74-917c-9333d34ea481\") " pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:18 crc kubenswrapper[4789]: I1216 06:57:18.713696 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.123332 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xktm2"] Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.361419 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.467151 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-utilities\") pod \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.467499 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-catalog-content\") pod \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.467534 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8gcv\" (UniqueName: \"kubernetes.io/projected/8a620056-2e2e-46ae-9a32-c8aea4b297c4-kube-api-access-m8gcv\") pod \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\" (UID: \"8a620056-2e2e-46ae-9a32-c8aea4b297c4\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.467990 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-utilities" (OuterVolumeSpecName: "utilities") pod "8a620056-2e2e-46ae-9a32-c8aea4b297c4" (UID: "8a620056-2e2e-46ae-9a32-c8aea4b297c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.482129 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a620056-2e2e-46ae-9a32-c8aea4b297c4-kube-api-access-m8gcv" (OuterVolumeSpecName: "kube-api-access-m8gcv") pod "8a620056-2e2e-46ae-9a32-c8aea4b297c4" (UID: "8a620056-2e2e-46ae-9a32-c8aea4b297c4"). InnerVolumeSpecName "kube-api-access-m8gcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.526043 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a620056-2e2e-46ae-9a32-c8aea4b297c4" (UID: "8a620056-2e2e-46ae-9a32-c8aea4b297c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.542736 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" event={"ID":"26b72fe5-4aad-4c74-917c-9333d34ea481","Type":"ContainerStarted","Data":"e03d0739a92c360636cd2190ff60a8c0d6807670fffcd2921f754d5e4bc9f7b0"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.543001 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.543126 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" event={"ID":"26b72fe5-4aad-4c74-917c-9333d34ea481","Type":"ContainerStarted","Data":"df410482b6661d4d5dc04c73cc0547879aeee31719c14af68dd196397df90403"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.544761 4789 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xktm2 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" start-of-body= Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.544816 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" podUID="26b72fe5-4aad-4c74-917c-9333d34ea481" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.545560 4789 generic.go:334] "Generic (PLEG): container finished" podID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerID="d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d" exitCode=0 Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.545757 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btjcd" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.553547 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btjcd" event={"ID":"8a620056-2e2e-46ae-9a32-c8aea4b297c4","Type":"ContainerDied","Data":"d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.553628 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btjcd" event={"ID":"8a620056-2e2e-46ae-9a32-c8aea4b297c4","Type":"ContainerDied","Data":"0500c04f6352d976e18daadcf259e8ca2fe642164bc0f2493aec73fa72e25044"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.553656 4789 scope.go:117] "RemoveContainer" containerID="d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.562934 4789 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" podStartSLOduration=1.562882206 podStartE2EDuration="1.562882206s" podCreationTimestamp="2025-12-16 06:57:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 06:57:19.557618907 +0000 UTC m=+377.819506536" watchObservedRunningTime="2025-12-16 06:57:19.562882206 +0000 UTC m=+377.824769855" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.573807 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.573853 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a620056-2e2e-46ae-9a32-c8aea4b297c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.573870 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8gcv\" (UniqueName: \"kubernetes.io/projected/8a620056-2e2e-46ae-9a32-c8aea4b297c4-kube-api-access-m8gcv\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.574372 4789 generic.go:334] "Generic (PLEG): container finished" podID="9af33cc0-7e86-482a-b3a1-89df07600676" containerID="9e6722da7678a0a614b638dfbbb4ce49a52d6e2d4325100a7f6e39ce88f0c3f2" exitCode=0 Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.574507 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7k7" event={"ID":"9af33cc0-7e86-482a-b3a1-89df07600676","Type":"ContainerDied","Data":"9e6722da7678a0a614b638dfbbb4ce49a52d6e2d4325100a7f6e39ce88f0c3f2"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.581252 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="706aa82a-2280-4715-919c-a480a2a81f8d" containerID="000b76426f38d969384bb9190479f5c6282b00345280198570f0dc0333fc4a75" exitCode=0 Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.581283 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wdfg" event={"ID":"706aa82a-2280-4715-919c-a480a2a81f8d","Type":"ContainerDied","Data":"000b76426f38d969384bb9190479f5c6282b00345280198570f0dc0333fc4a75"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.581351 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wdfg" event={"ID":"706aa82a-2280-4715-919c-a480a2a81f8d","Type":"ContainerDied","Data":"3d42e9d8217933efc87c18af42abbf56f4c0775814b6daa2e0aa407fe21a5abe"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.581363 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d42e9d8217933efc87c18af42abbf56f4c0775814b6daa2e0aa407fe21a5abe" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.586269 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerID="dc68cf5edf6a22837f52641014946cbc974bddf7d3d7cffed5287128ea44d580" exitCode=0 Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.587121 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw2sm" event={"ID":"ca1b1396-77d5-4d37-a72b-fcb9591cbf40","Type":"ContainerDied","Data":"dc68cf5edf6a22837f52641014946cbc974bddf7d3d7cffed5287128ea44d580"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.587187 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw2sm" event={"ID":"ca1b1396-77d5-4d37-a72b-fcb9591cbf40","Type":"ContainerDied","Data":"fd9c5fb4e313087368a94e12220776fa5456f4892c7dfc0eb59b4659714562ea"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.587468 4789 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="fd9c5fb4e313087368a94e12220776fa5456f4892c7dfc0eb59b4659714562ea" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.590487 4789 generic.go:334] "Generic (PLEG): container finished" podID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerID="3adc11389a094fb049402fc6d63bd295fd729bf28285b84a7e8e507b240706fd" exitCode=0 Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.590529 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" event={"ID":"03d51736-0f2b-4c40-b6f1-ee44fa4312f9","Type":"ContainerDied","Data":"3adc11389a094fb049402fc6d63bd295fd729bf28285b84a7e8e507b240706fd"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.590559 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" event={"ID":"03d51736-0f2b-4c40-b6f1-ee44fa4312f9","Type":"ContainerDied","Data":"699850b79f63a71b434d40c2b1039ce98122c0dfb399474b77366ee5246773fa"} Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.590573 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="699850b79f63a71b434d40c2b1039ce98122c0dfb399474b77366ee5246773fa" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.599060 4789 scope.go:117] "RemoveContainer" containerID="8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.614391 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.616675 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lw2sm" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.616985 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btjcd"] Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.618752 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.621627 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-btjcd"] Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.645665 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.647854 4789 scope.go:117] "RemoveContainer" containerID="873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.676682 4789 scope.go:117] "RemoveContainer" containerID="d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d" Dec 16 06:57:19 crc kubenswrapper[4789]: E1216 06:57:19.677145 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d\": container with ID starting with d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d not found: ID does not exist" containerID="d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.677172 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d"} err="failed to get container status \"d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d\": rpc 
error: code = NotFound desc = could not find container \"d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d\": container with ID starting with d1aa9010f2edd3f292ee3a6d80c878758c0eab403f10008d3bb70f6650df516d not found: ID does not exist" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.677193 4789 scope.go:117] "RemoveContainer" containerID="8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76" Dec 16 06:57:19 crc kubenswrapper[4789]: E1216 06:57:19.678549 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76\": container with ID starting with 8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76 not found: ID does not exist" containerID="8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.678640 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76"} err="failed to get container status \"8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76\": rpc error: code = NotFound desc = could not find container \"8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76\": container with ID starting with 8f957563f5d9d5f93e71b5e0eb2974b46a974eeee529a95caa007db783ae7a76 not found: ID does not exist" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.678682 4789 scope.go:117] "RemoveContainer" containerID="873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534" Dec 16 06:57:19 crc kubenswrapper[4789]: E1216 06:57:19.679120 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534\": container with ID starting with 
873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534 not found: ID does not exist" containerID="873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.679150 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534"} err="failed to get container status \"873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534\": rpc error: code = NotFound desc = could not find container \"873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534\": container with ID starting with 873aafe60a19d7afb179995fcfbc58d056914145611f7cebe0b40c021730f534 not found: ID does not exist" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.679212 4789 scope.go:117] "RemoveContainer" containerID="44c5f1b7d671e9f5747dc3175b4a2f29c9eabc568087d64d8da5d40c5895913b" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.775879 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fqtq\" (UniqueName: \"kubernetes.io/projected/706aa82a-2280-4715-919c-a480a2a81f8d-kube-api-access-8fqtq\") pod \"706aa82a-2280-4715-919c-a480a2a81f8d\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.775946 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-utilities\") pod \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.775985 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-trusted-ca\") pod \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\" (UID: 
\"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.775999 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fcct\" (UniqueName: \"kubernetes.io/projected/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-kube-api-access-9fcct\") pod \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776023 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-catalog-content\") pod \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\" (UID: \"ca1b1396-77d5-4d37-a72b-fcb9591cbf40\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776043 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-operator-metrics\") pod \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776088 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj6h8\" (UniqueName: \"kubernetes.io/projected/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-kube-api-access-nj6h8\") pod \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\" (UID: \"03d51736-0f2b-4c40-b6f1-ee44fa4312f9\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776123 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-catalog-content\") pod \"706aa82a-2280-4715-919c-a480a2a81f8d\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776156 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-utilities\") pod \"706aa82a-2280-4715-919c-a480a2a81f8d\" (UID: \"706aa82a-2280-4715-919c-a480a2a81f8d\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776182 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bblh\" (UniqueName: \"kubernetes.io/projected/9af33cc0-7e86-482a-b3a1-89df07600676-kube-api-access-5bblh\") pod \"9af33cc0-7e86-482a-b3a1-89df07600676\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776200 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-catalog-content\") pod \"9af33cc0-7e86-482a-b3a1-89df07600676\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776226 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-utilities\") pod \"9af33cc0-7e86-482a-b3a1-89df07600676\" (UID: \"9af33cc0-7e86-482a-b3a1-89df07600676\") " Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.776721 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-utilities" (OuterVolumeSpecName: "utilities") pod "ca1b1396-77d5-4d37-a72b-fcb9591cbf40" (UID: "ca1b1396-77d5-4d37-a72b-fcb9591cbf40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.777208 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-utilities" (OuterVolumeSpecName: "utilities") pod "9af33cc0-7e86-482a-b3a1-89df07600676" (UID: "9af33cc0-7e86-482a-b3a1-89df07600676"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.778678 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "03d51736-0f2b-4c40-b6f1-ee44fa4312f9" (UID: "03d51736-0f2b-4c40-b6f1-ee44fa4312f9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.780058 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-kube-api-access-nj6h8" (OuterVolumeSpecName: "kube-api-access-nj6h8") pod "03d51736-0f2b-4c40-b6f1-ee44fa4312f9" (UID: "03d51736-0f2b-4c40-b6f1-ee44fa4312f9"). InnerVolumeSpecName "kube-api-access-nj6h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.780172 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706aa82a-2280-4715-919c-a480a2a81f8d-kube-api-access-8fqtq" (OuterVolumeSpecName: "kube-api-access-8fqtq") pod "706aa82a-2280-4715-919c-a480a2a81f8d" (UID: "706aa82a-2280-4715-919c-a480a2a81f8d"). InnerVolumeSpecName "kube-api-access-8fqtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.780651 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af33cc0-7e86-482a-b3a1-89df07600676-kube-api-access-5bblh" (OuterVolumeSpecName: "kube-api-access-5bblh") pod "9af33cc0-7e86-482a-b3a1-89df07600676" (UID: "9af33cc0-7e86-482a-b3a1-89df07600676"). InnerVolumeSpecName "kube-api-access-5bblh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.780716 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-utilities" (OuterVolumeSpecName: "utilities") pod "706aa82a-2280-4715-919c-a480a2a81f8d" (UID: "706aa82a-2280-4715-919c-a480a2a81f8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.781299 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "03d51736-0f2b-4c40-b6f1-ee44fa4312f9" (UID: "03d51736-0f2b-4c40-b6f1-ee44fa4312f9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.791179 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-kube-api-access-9fcct" (OuterVolumeSpecName: "kube-api-access-9fcct") pod "ca1b1396-77d5-4d37-a72b-fcb9591cbf40" (UID: "ca1b1396-77d5-4d37-a72b-fcb9591cbf40"). InnerVolumeSpecName "kube-api-access-9fcct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.797405 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "706aa82a-2280-4715-919c-a480a2a81f8d" (UID: "706aa82a-2280-4715-919c-a480a2a81f8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.842970 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9af33cc0-7e86-482a-b3a1-89df07600676" (UID: "9af33cc0-7e86-482a-b3a1-89df07600676"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877881 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877935 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fqtq\" (UniqueName: \"kubernetes.io/projected/706aa82a-2280-4715-919c-a480a2a81f8d-kube-api-access-8fqtq\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877949 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877959 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fcct\" (UniqueName: \"kubernetes.io/projected/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-kube-api-access-9fcct\") on 
node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877968 4789 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877977 4789 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877985 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj6h8\" (UniqueName: \"kubernetes.io/projected/03d51736-0f2b-4c40-b6f1-ee44fa4312f9-kube-api-access-nj6h8\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.877994 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.878001 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706aa82a-2280-4715-919c-a480a2a81f8d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.878009 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bblh\" (UniqueName: \"kubernetes.io/projected/9af33cc0-7e86-482a-b3a1-89df07600676-kube-api-access-5bblh\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.878017 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af33cc0-7e86-482a-b3a1-89df07600676-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:19 crc 
kubenswrapper[4789]: I1216 06:57:19.920019 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca1b1396-77d5-4d37-a72b-fcb9591cbf40" (UID: "ca1b1396-77d5-4d37-a72b-fcb9591cbf40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:19 crc kubenswrapper[4789]: I1216 06:57:19.979386 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1b1396-77d5-4d37-a72b-fcb9591cbf40-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.111696 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" path="/var/lib/kubelet/pods/8a620056-2e2e-46ae-9a32-c8aea4b297c4/volumes" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.595440 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cmw44" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.600344 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wdfg" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.600997 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5r7k7" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.601351 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5r7k7" event={"ID":"9af33cc0-7e86-482a-b3a1-89df07600676","Type":"ContainerDied","Data":"d35afea1d0dfa60d63d3f88635c6bec1d5234cc2970104a6a72e1d9275a8e752"} Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.601397 4789 scope.go:117] "RemoveContainer" containerID="9e6722da7678a0a614b638dfbbb4ce49a52d6e2d4325100a7f6e39ce88f0c3f2" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.601479 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw2sm" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.605436 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xktm2" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.628673 4789 scope.go:117] "RemoveContainer" containerID="37b071419a4552b47eb17fd41953d763d9960eea7fd05b9e441944d3490406c4" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.635896 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wdfg"] Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.639132 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wdfg"] Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.642464 4789 scope.go:117] "RemoveContainer" containerID="869b58f98046f7050c25b7f1658ad22c03a664cdad95b89562eaddce0216ee9d" Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.673085 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw2sm"] Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.673141 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-lw2sm"] Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.685766 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5r7k7"] Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.697182 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5r7k7"] Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.702193 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmw44"] Dec 16 06:57:20 crc kubenswrapper[4789]: I1216 06:57:20.712657 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cmw44"] Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.815420 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jfd8n"] Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.815885 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.815900 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.815931 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.815940 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.815951 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 
06:57:21.815959 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.815969 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.815976 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.815987 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.815994 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816005 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816012 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816022 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816029 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816038 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 
06:57:21.816045 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="extract-utilities" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816055 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816061 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816071 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816080 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816088 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816095 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="extract-content" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816104 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816111 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816120 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" Dec 16 06:57:21 crc kubenswrapper[4789]: 
I1216 06:57:21.816127 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" Dec 16 06:57:21 crc kubenswrapper[4789]: E1216 06:57:21.816140 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816148 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816252 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a620056-2e2e-46ae-9a32-c8aea4b297c4" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816263 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816275 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" containerName="marketplace-operator" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816285 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816296 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.816306 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" containerName="registry-server" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.817188 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.820146 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.823541 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfd8n"] Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.906546 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-catalog-content\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.906596 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jft\" (UniqueName: \"kubernetes.io/projected/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-kube-api-access-t6jft\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.906650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-utilities\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.927455 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 16 06:57:21 crc kubenswrapper[4789]: I1216 06:57:21.927518 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.008851 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-catalog-content\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.008972 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6jft\" (UniqueName: \"kubernetes.io/projected/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-kube-api-access-t6jft\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.009019 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-utilities\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.009470 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-utilities\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" 
Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.009740 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-catalog-content\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.017191 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m5xm2"] Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.018181 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.020659 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.029046 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5xm2"] Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.037515 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jft\" (UniqueName: \"kubernetes.io/projected/e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420-kube-api-access-t6jft\") pod \"redhat-marketplace-jfd8n\" (UID: \"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420\") " pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.110192 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-catalog-content\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.110296 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-utilities\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.110491 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrc8c\" (UniqueName: \"kubernetes.io/projected/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-kube-api-access-qrc8c\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.113772 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d51736-0f2b-4c40-b6f1-ee44fa4312f9" path="/var/lib/kubelet/pods/03d51736-0f2b-4c40-b6f1-ee44fa4312f9/volumes" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.114510 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706aa82a-2280-4715-919c-a480a2a81f8d" path="/var/lib/kubelet/pods/706aa82a-2280-4715-919c-a480a2a81f8d/volumes" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.115585 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af33cc0-7e86-482a-b3a1-89df07600676" path="/var/lib/kubelet/pods/9af33cc0-7e86-482a-b3a1-89df07600676/volumes" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.117023 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1b1396-77d5-4d37-a72b-fcb9591cbf40" path="/var/lib/kubelet/pods/ca1b1396-77d5-4d37-a72b-fcb9591cbf40/volumes" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.139818 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.212164 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-utilities\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.212530 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrc8c\" (UniqueName: \"kubernetes.io/projected/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-kube-api-access-qrc8c\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.212647 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-catalog-content\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.213158 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-catalog-content\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.213416 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-utilities\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " 
pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.235578 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrc8c\" (UniqueName: \"kubernetes.io/projected/f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4-kube-api-access-qrc8c\") pod \"certified-operators-m5xm2\" (UID: \"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4\") " pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.360506 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.554835 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfd8n"] Dec 16 06:57:22 crc kubenswrapper[4789]: W1216 06:57:22.561882 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode28c35f9_1d8d_4e8c_9fc8_6e9b239a0420.slice/crio-53360ec69abab062ade1a3331df6d23102a2afd22e0d5e63ec7bef508c23b5f5 WatchSource:0}: Error finding container 53360ec69abab062ade1a3331df6d23102a2afd22e0d5e63ec7bef508c23b5f5: Status 404 returned error can't find the container with id 53360ec69abab062ade1a3331df6d23102a2afd22e0d5e63ec7bef508c23b5f5 Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.612652 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfd8n" event={"ID":"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420","Type":"ContainerStarted","Data":"53360ec69abab062ade1a3331df6d23102a2afd22e0d5e63ec7bef508c23b5f5"} Dec 16 06:57:22 crc kubenswrapper[4789]: I1216 06:57:22.750329 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5xm2"] Dec 16 06:57:22 crc kubenswrapper[4789]: W1216 06:57:22.765982 4789 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf40bfb8a_79f1_4f9b_adfb_e4b93628e2d4.slice/crio-6724eeea943fab2bc666ea2f142c1d05bf3d4954d3cf70e751883eae01bd33a9 WatchSource:0}: Error finding container 6724eeea943fab2bc666ea2f142c1d05bf3d4954d3cf70e751883eae01bd33a9: Status 404 returned error can't find the container with id 6724eeea943fab2bc666ea2f142c1d05bf3d4954d3cf70e751883eae01bd33a9 Dec 16 06:57:23 crc kubenswrapper[4789]: I1216 06:57:23.619140 4789 generic.go:334] "Generic (PLEG): container finished" podID="e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420" containerID="f87803c9d41a74126b6d86195ddc42cb82e91a18d7b60653b28eae722ff5850c" exitCode=0 Dec 16 06:57:23 crc kubenswrapper[4789]: I1216 06:57:23.619188 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfd8n" event={"ID":"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420","Type":"ContainerDied","Data":"f87803c9d41a74126b6d86195ddc42cb82e91a18d7b60653b28eae722ff5850c"} Dec 16 06:57:23 crc kubenswrapper[4789]: I1216 06:57:23.621413 4789 generic.go:334] "Generic (PLEG): container finished" podID="f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4" containerID="711dc6b01ad049f73241356d20b91eebdc23b8d119e6db472b58388abac68603" exitCode=0 Dec 16 06:57:23 crc kubenswrapper[4789]: I1216 06:57:23.621454 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5xm2" event={"ID":"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4","Type":"ContainerDied","Data":"711dc6b01ad049f73241356d20b91eebdc23b8d119e6db472b58388abac68603"} Dec 16 06:57:23 crc kubenswrapper[4789]: I1216 06:57:23.621480 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5xm2" event={"ID":"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4","Type":"ContainerStarted","Data":"6724eeea943fab2bc666ea2f142c1d05bf3d4954d3cf70e751883eae01bd33a9"} Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.212442 4789 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-l2jhz"] Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.213405 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.215846 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.223767 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2jhz"] Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.340875 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-utilities\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.340966 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-catalog-content\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.341001 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7clzt\" (UniqueName: \"kubernetes.io/projected/63c09acc-7922-4c06-b5ee-74d87e7e9d80-kube-api-access-7clzt\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.412456 4789 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-ncql8"] Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.413674 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.416529 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.420084 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncql8"] Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.442363 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-utilities\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.442409 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-catalog-content\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.442437 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7clzt\" (UniqueName: \"kubernetes.io/projected/63c09acc-7922-4c06-b5ee-74d87e7e9d80-kube-api-access-7clzt\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.442893 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-catalog-content\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.442959 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-utilities\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.462933 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7clzt\" (UniqueName: \"kubernetes.io/projected/63c09acc-7922-4c06-b5ee-74d87e7e9d80-kube-api-access-7clzt\") pod \"community-operators-l2jhz\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.532465 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.543641 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-catalog-content\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.543686 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-utilities\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.543749 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkd6g\" (UniqueName: \"kubernetes.io/projected/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-kube-api-access-bkd6g\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.644757 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-utilities\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.645213 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkd6g\" (UniqueName: \"kubernetes.io/projected/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-kube-api-access-bkd6g\") pod \"redhat-operators-ncql8\" (UID: 
\"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.645290 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-catalog-content\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.646313 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-catalog-content\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.646711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-utilities\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.663786 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkd6g\" (UniqueName: \"kubernetes.io/projected/ce2801ec-57a1-436f-afeb-bc9fac03ec0a-kube-api-access-bkd6g\") pod \"redhat-operators-ncql8\" (UID: \"ce2801ec-57a1-436f-afeb-bc9fac03ec0a\") " pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:24 crc kubenswrapper[4789]: I1216 06:57:24.735431 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:25 crc kubenswrapper[4789]: I1216 06:57:25.030538 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2jhz"] Dec 16 06:57:25 crc kubenswrapper[4789]: I1216 06:57:25.136218 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncql8"] Dec 16 06:57:25 crc kubenswrapper[4789]: W1216 06:57:25.141022 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce2801ec_57a1_436f_afeb_bc9fac03ec0a.slice/crio-404b28368e5cda6dbacc0ec70961a9afc0f82176d636aea32d6d9d8d4ab2fcde WatchSource:0}: Error finding container 404b28368e5cda6dbacc0ec70961a9afc0f82176d636aea32d6d9d8d4ab2fcde: Status 404 returned error can't find the container with id 404b28368e5cda6dbacc0ec70961a9afc0f82176d636aea32d6d9d8d4ab2fcde Dec 16 06:57:25 crc kubenswrapper[4789]: I1216 06:57:25.631153 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncql8" event={"ID":"ce2801ec-57a1-436f-afeb-bc9fac03ec0a","Type":"ContainerStarted","Data":"404b28368e5cda6dbacc0ec70961a9afc0f82176d636aea32d6d9d8d4ab2fcde"} Dec 16 06:57:25 crc kubenswrapper[4789]: I1216 06:57:25.632049 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2jhz" event={"ID":"63c09acc-7922-4c06-b5ee-74d87e7e9d80","Type":"ContainerStarted","Data":"db487e877d244460fe3474349185646a0653dcf460d80bac31e4b193dd7b29bf"} Dec 16 06:57:26 crc kubenswrapper[4789]: I1216 06:57:26.126122 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-w4pcc" Dec 16 06:57:26 crc kubenswrapper[4789]: I1216 06:57:26.176540 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5786"] Dec 16 
06:57:29 crc kubenswrapper[4789]: I1216 06:57:29.654141 4789 generic.go:334] "Generic (PLEG): container finished" podID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerID="9a07630bd2effed51a4bb91caa8ca12366a39692f665d9c41c191227e4e27799" exitCode=0 Dec 16 06:57:29 crc kubenswrapper[4789]: I1216 06:57:29.654231 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2jhz" event={"ID":"63c09acc-7922-4c06-b5ee-74d87e7e9d80","Type":"ContainerDied","Data":"9a07630bd2effed51a4bb91caa8ca12366a39692f665d9c41c191227e4e27799"} Dec 16 06:57:29 crc kubenswrapper[4789]: I1216 06:57:29.657151 4789 generic.go:334] "Generic (PLEG): container finished" podID="ce2801ec-57a1-436f-afeb-bc9fac03ec0a" containerID="e612bd6a5973b038811f1d3a2c420e75df3464e017df9d6846670b65dbe3fb0e" exitCode=0 Dec 16 06:57:29 crc kubenswrapper[4789]: I1216 06:57:29.657221 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncql8" event={"ID":"ce2801ec-57a1-436f-afeb-bc9fac03ec0a","Type":"ContainerDied","Data":"e612bd6a5973b038811f1d3a2c420e75df3464e017df9d6846670b65dbe3fb0e"} Dec 16 06:57:33 crc kubenswrapper[4789]: I1216 06:57:33.679423 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncql8" event={"ID":"ce2801ec-57a1-436f-afeb-bc9fac03ec0a","Type":"ContainerStarted","Data":"0e1f4c4f5507a85490d81c3c1f1fd32810c6cf610748264d3466e378ac71dd10"} Dec 16 06:57:33 crc kubenswrapper[4789]: I1216 06:57:33.681795 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfd8n" event={"ID":"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420","Type":"ContainerStarted","Data":"a4ed9f4e2e7304b9bba7dc5756f4a6f2d7f1ece844dc8fa42ba4f2d108f37ecf"} Dec 16 06:57:33 crc kubenswrapper[4789]: I1216 06:57:33.683812 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5xm2" 
event={"ID":"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4","Type":"ContainerStarted","Data":"a934760bc37be85b86b08d7afdd51e617c498e318f3695ab1c15010a8960f7d3"} Dec 16 06:57:35 crc kubenswrapper[4789]: E1216 06:57:35.926270 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf40bfb8a_79f1_4f9b_adfb_e4b93628e2d4.slice/crio-conmon-a934760bc37be85b86b08d7afdd51e617c498e318f3695ab1c15010a8960f7d3.scope\": RecentStats: unable to find data in memory cache]" Dec 16 06:57:36 crc kubenswrapper[4789]: I1216 06:57:36.702552 4789 generic.go:334] "Generic (PLEG): container finished" podID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerID="516bf647bdef8ec0cb28524c73ee2f42321aced10951d7b193b997b0769973a9" exitCode=0 Dec 16 06:57:36 crc kubenswrapper[4789]: I1216 06:57:36.702721 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2jhz" event={"ID":"63c09acc-7922-4c06-b5ee-74d87e7e9d80","Type":"ContainerDied","Data":"516bf647bdef8ec0cb28524c73ee2f42321aced10951d7b193b997b0769973a9"} Dec 16 06:57:36 crc kubenswrapper[4789]: I1216 06:57:36.705154 4789 generic.go:334] "Generic (PLEG): container finished" podID="e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420" containerID="a4ed9f4e2e7304b9bba7dc5756f4a6f2d7f1ece844dc8fa42ba4f2d108f37ecf" exitCode=0 Dec 16 06:57:36 crc kubenswrapper[4789]: I1216 06:57:36.705185 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfd8n" event={"ID":"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420","Type":"ContainerDied","Data":"a4ed9f4e2e7304b9bba7dc5756f4a6f2d7f1ece844dc8fa42ba4f2d108f37ecf"} Dec 16 06:57:36 crc kubenswrapper[4789]: I1216 06:57:36.708493 4789 generic.go:334] "Generic (PLEG): container finished" podID="f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4" containerID="a934760bc37be85b86b08d7afdd51e617c498e318f3695ab1c15010a8960f7d3" exitCode=0 Dec 16 06:57:36 
crc kubenswrapper[4789]: I1216 06:57:36.708559 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5xm2" event={"ID":"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4","Type":"ContainerDied","Data":"a934760bc37be85b86b08d7afdd51e617c498e318f3695ab1c15010a8960f7d3"} Dec 16 06:57:36 crc kubenswrapper[4789]: I1216 06:57:36.712360 4789 generic.go:334] "Generic (PLEG): container finished" podID="ce2801ec-57a1-436f-afeb-bc9fac03ec0a" containerID="0e1f4c4f5507a85490d81c3c1f1fd32810c6cf610748264d3466e378ac71dd10" exitCode=0 Dec 16 06:57:36 crc kubenswrapper[4789]: I1216 06:57:36.712399 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncql8" event={"ID":"ce2801ec-57a1-436f-afeb-bc9fac03ec0a","Type":"ContainerDied","Data":"0e1f4c4f5507a85490d81c3c1f1fd32810c6cf610748264d3466e378ac71dd10"} Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.720850 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2jhz" event={"ID":"63c09acc-7922-4c06-b5ee-74d87e7e9d80","Type":"ContainerStarted","Data":"08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c"} Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.724226 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5xm2" event={"ID":"f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4","Type":"ContainerStarted","Data":"f90ad239530f4875e273218a7c74b5559c9857fb773a0f3d85d3cb10828f4613"} Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.726678 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncql8" event={"ID":"ce2801ec-57a1-436f-afeb-bc9fac03ec0a","Type":"ContainerStarted","Data":"cb53a72d287e40fed4ca69d77ad091c788c64d36832ea41274246b0553353c49"} Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.729230 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-jfd8n" event={"ID":"e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420","Type":"ContainerStarted","Data":"4db8d87cfa8e6a0566eb63cca339895de8ce2e7f3f3c55a64b6c9fab06dbffd4"} Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.755623 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2jhz" podStartSLOduration=7.322697158 podStartE2EDuration="13.7556004s" podCreationTimestamp="2025-12-16 06:57:24 +0000 UTC" firstStartedPulling="2025-12-16 06:57:30.869533711 +0000 UTC m=+389.131421340" lastFinishedPulling="2025-12-16 06:57:37.302436943 +0000 UTC m=+395.564324582" observedRunningTime="2025-12-16 06:57:37.748691511 +0000 UTC m=+396.010579150" watchObservedRunningTime="2025-12-16 06:57:37.7556004 +0000 UTC m=+396.017488029" Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.774982 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ncql8" podStartSLOduration=6.836341346 podStartE2EDuration="13.774961577s" podCreationTimestamp="2025-12-16 06:57:24 +0000 UTC" firstStartedPulling="2025-12-16 06:57:30.335322921 +0000 UTC m=+388.597210550" lastFinishedPulling="2025-12-16 06:57:37.273943152 +0000 UTC m=+395.535830781" observedRunningTime="2025-12-16 06:57:37.774661599 +0000 UTC m=+396.036549238" watchObservedRunningTime="2025-12-16 06:57:37.774961577 +0000 UTC m=+396.036849206" Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.797209 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m5xm2" podStartSLOduration=3.101894434 podStartE2EDuration="16.797187068s" podCreationTimestamp="2025-12-16 06:57:21 +0000 UTC" firstStartedPulling="2025-12-16 06:57:23.622691797 +0000 UTC m=+381.884579426" lastFinishedPulling="2025-12-16 06:57:37.317984431 +0000 UTC m=+395.579872060" observedRunningTime="2025-12-16 06:57:37.793363718 +0000 UTC 
m=+396.055251357" watchObservedRunningTime="2025-12-16 06:57:37.797187068 +0000 UTC m=+396.059074717" Dec 16 06:57:37 crc kubenswrapper[4789]: I1216 06:57:37.822839 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jfd8n" podStartSLOduration=3.05783508 podStartE2EDuration="16.822818757s" podCreationTimestamp="2025-12-16 06:57:21 +0000 UTC" firstStartedPulling="2025-12-16 06:57:23.621502868 +0000 UTC m=+381.883390497" lastFinishedPulling="2025-12-16 06:57:37.386486545 +0000 UTC m=+395.648374174" observedRunningTime="2025-12-16 06:57:37.816325999 +0000 UTC m=+396.078213648" watchObservedRunningTime="2025-12-16 06:57:37.822818757 +0000 UTC m=+396.084706396" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.140307 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.140620 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.185346 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.360693 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.360744 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.396854 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.802787 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-jfd8n" Dec 16 06:57:42 crc kubenswrapper[4789]: I1216 06:57:42.811773 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m5xm2" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.534840 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.536223 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.570295 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.736003 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.736059 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.788382 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.831714 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 06:57:44 crc kubenswrapper[4789]: I1216 06:57:44.832155 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ncql8" Dec 16 06:57:51 crc kubenswrapper[4789]: I1216 06:57:51.226480 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" 
podUID="be028739-1351-4883-95ec-35fb89831c72" containerName="registry" containerID="cri-o://7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9" gracePeriod=30 Dec 16 06:57:51 crc kubenswrapper[4789]: I1216 06:57:51.927471 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 06:57:51 crc kubenswrapper[4789]: I1216 06:57:51.927541 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.546474 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611233 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88r4l\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-kube-api-access-88r4l\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611372 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611434 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-trusted-ca\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611471 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be028739-1351-4883-95ec-35fb89831c72-ca-trust-extracted\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611507 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-bound-sa-token\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611527 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-registry-tls\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611557 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be028739-1351-4883-95ec-35fb89831c72-installation-pull-secrets\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.611579 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-registry-certificates\") pod \"be028739-1351-4883-95ec-35fb89831c72\" (UID: \"be028739-1351-4883-95ec-35fb89831c72\") " Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.612679 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.613124 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.618260 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.618868 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-kube-api-access-88r4l" (OuterVolumeSpecName: "kube-api-access-88r4l") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "kube-api-access-88r4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.620320 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.622873 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be028739-1351-4883-95ec-35fb89831c72-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.623108 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.632509 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be028739-1351-4883-95ec-35fb89831c72-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "be028739-1351-4883-95ec-35fb89831c72" (UID: "be028739-1351-4883-95ec-35fb89831c72"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.713391 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.713442 4789 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/be028739-1351-4883-95ec-35fb89831c72-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.713461 4789 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.713525 4789 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.713549 4789 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/be028739-1351-4883-95ec-35fb89831c72-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.713571 4789 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/be028739-1351-4883-95ec-35fb89831c72-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.713587 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88r4l\" (UniqueName: \"kubernetes.io/projected/be028739-1351-4883-95ec-35fb89831c72-kube-api-access-88r4l\") on node \"crc\" DevicePath \"\"" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.826233 4789 generic.go:334] "Generic (PLEG): container finished" podID="be028739-1351-4883-95ec-35fb89831c72" containerID="7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9" exitCode=0 Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.826284 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.826331 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" event={"ID":"be028739-1351-4883-95ec-35fb89831c72","Type":"ContainerDied","Data":"7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9"} Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.826733 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-p5786" event={"ID":"be028739-1351-4883-95ec-35fb89831c72","Type":"ContainerDied","Data":"8f51d0c1db093f198653e4c032c99d097e45ae41066d6d75da56e63ba42df4c3"} Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.826763 4789 scope.go:117] "RemoveContainer" containerID="7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.843773 4789 scope.go:117] "RemoveContainer" containerID="7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9" Dec 16 06:57:53 crc kubenswrapper[4789]: E1216 06:57:53.844317 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9\": container with ID starting with 7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9 not found: ID does not exist" containerID="7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.844353 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9"} err="failed to get container status \"7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9\": rpc error: code = NotFound desc = could not find container 
\"7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9\": container with ID starting with 7eb551ae7a5a4704e51377a906ce09cb58c6b8ea9015db1d9ec1ac43e7cdd5e9 not found: ID does not exist" Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.857550 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5786"] Dec 16 06:57:53 crc kubenswrapper[4789]: I1216 06:57:53.861762 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-p5786"] Dec 16 06:57:54 crc kubenswrapper[4789]: I1216 06:57:54.111127 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be028739-1351-4883-95ec-35fb89831c72" path="/var/lib/kubelet/pods/be028739-1351-4883-95ec-35fb89831c72/volumes" Dec 16 06:58:21 crc kubenswrapper[4789]: I1216 06:58:21.928104 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 06:58:21 crc kubenswrapper[4789]: I1216 06:58:21.928990 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 06:58:21 crc kubenswrapper[4789]: I1216 06:58:21.929053 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 06:58:21 crc kubenswrapper[4789]: I1216 06:58:21.929732 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2d12421f385572b5f49ec16ce5dc368fcce4c0b47f4845aad6327275ef658245"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 06:58:21 crc kubenswrapper[4789]: I1216 06:58:21.929818 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://2d12421f385572b5f49ec16ce5dc368fcce4c0b47f4845aad6327275ef658245" gracePeriod=600 Dec 16 06:58:22 crc kubenswrapper[4789]: I1216 06:58:22.216604 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="2d12421f385572b5f49ec16ce5dc368fcce4c0b47f4845aad6327275ef658245" exitCode=0 Dec 16 06:58:22 crc kubenswrapper[4789]: I1216 06:58:22.216660 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"2d12421f385572b5f49ec16ce5dc368fcce4c0b47f4845aad6327275ef658245"} Dec 16 06:58:22 crc kubenswrapper[4789]: I1216 06:58:22.216706 4789 scope.go:117] "RemoveContainer" containerID="7f35d1ba75312b9bf2723bdd6db6985398349b366a3c618459d54472e2be16b6" Dec 16 06:58:23 crc kubenswrapper[4789]: I1216 06:58:23.224455 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"d23462b9fda7f227b342d36364779a64e2d961db504204feb74ea1aa2b298979"} Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.167298 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m"] Dec 16 07:00:00 crc kubenswrapper[4789]: 
E1216 07:00:00.168147 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be028739-1351-4883-95ec-35fb89831c72" containerName="registry" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.168182 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="be028739-1351-4883-95ec-35fb89831c72" containerName="registry" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.168299 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="be028739-1351-4883-95ec-35fb89831c72" containerName="registry" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.168745 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.170410 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.170628 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.191723 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m"] Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.270600 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31cf204-c244-4aac-954a-9ef9222209df-secret-volume\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.271171 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6wpz\" (UniqueName: 
\"kubernetes.io/projected/e31cf204-c244-4aac-954a-9ef9222209df-kube-api-access-n6wpz\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.271235 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31cf204-c244-4aac-954a-9ef9222209df-config-volume\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.372807 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6wpz\" (UniqueName: \"kubernetes.io/projected/e31cf204-c244-4aac-954a-9ef9222209df-kube-api-access-n6wpz\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.372868 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31cf204-c244-4aac-954a-9ef9222209df-config-volume\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.372895 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31cf204-c244-4aac-954a-9ef9222209df-secret-volume\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc 
kubenswrapper[4789]: I1216 07:00:00.374084 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31cf204-c244-4aac-954a-9ef9222209df-config-volume\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.379063 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31cf204-c244-4aac-954a-9ef9222209df-secret-volume\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.391536 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6wpz\" (UniqueName: \"kubernetes.io/projected/e31cf204-c244-4aac-954a-9ef9222209df-kube-api-access-n6wpz\") pod \"collect-profiles-29431140-hn84m\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.495834 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.663824 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m"] Dec 16 07:00:00 crc kubenswrapper[4789]: I1216 07:00:00.715517 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" event={"ID":"e31cf204-c244-4aac-954a-9ef9222209df","Type":"ContainerStarted","Data":"72f1aa406502488d47455a71b32203c22aac449b5c2448f2d31eb724d2f978e0"} Dec 16 07:00:01 crc kubenswrapper[4789]: I1216 07:00:01.721180 4789 generic.go:334] "Generic (PLEG): container finished" podID="e31cf204-c244-4aac-954a-9ef9222209df" containerID="99adf9c5a0e69c36997dda840eed11aff22e9f0a4100d049a3c33332e3eb562e" exitCode=0 Dec 16 07:00:01 crc kubenswrapper[4789]: I1216 07:00:01.721250 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" event={"ID":"e31cf204-c244-4aac-954a-9ef9222209df","Type":"ContainerDied","Data":"99adf9c5a0e69c36997dda840eed11aff22e9f0a4100d049a3c33332e3eb562e"} Dec 16 07:00:02 crc kubenswrapper[4789]: I1216 07:00:02.313981 4789 scope.go:117] "RemoveContainer" containerID="9e8eac2b1e7f2c1ddbb9be410ad8532c433010e87cb5b6050e74cd1fe2eacaa8" Dec 16 07:00:02 crc kubenswrapper[4789]: I1216 07:00:02.331068 4789 scope.go:117] "RemoveContainer" containerID="d5997445e0f915ffcc6f95c6399f43a797da82179516d62f02ff05af3bf39f33" Dec 16 07:00:02 crc kubenswrapper[4789]: I1216 07:00:02.345885 4789 scope.go:117] "RemoveContainer" containerID="f491a6d37655c9eae25e2da1ed574968d517686d0b91eec7bd6800ba205e2595" Dec 16 07:00:02 crc kubenswrapper[4789]: I1216 07:00:02.925712 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.104656 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6wpz\" (UniqueName: \"kubernetes.io/projected/e31cf204-c244-4aac-954a-9ef9222209df-kube-api-access-n6wpz\") pod \"e31cf204-c244-4aac-954a-9ef9222209df\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.104727 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31cf204-c244-4aac-954a-9ef9222209df-config-volume\") pod \"e31cf204-c244-4aac-954a-9ef9222209df\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.104770 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31cf204-c244-4aac-954a-9ef9222209df-secret-volume\") pod \"e31cf204-c244-4aac-954a-9ef9222209df\" (UID: \"e31cf204-c244-4aac-954a-9ef9222209df\") " Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.105937 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e31cf204-c244-4aac-954a-9ef9222209df-config-volume" (OuterVolumeSpecName: "config-volume") pod "e31cf204-c244-4aac-954a-9ef9222209df" (UID: "e31cf204-c244-4aac-954a-9ef9222209df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.111528 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31cf204-c244-4aac-954a-9ef9222209df-kube-api-access-n6wpz" (OuterVolumeSpecName: "kube-api-access-n6wpz") pod "e31cf204-c244-4aac-954a-9ef9222209df" (UID: "e31cf204-c244-4aac-954a-9ef9222209df"). 
InnerVolumeSpecName "kube-api-access-n6wpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.115010 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31cf204-c244-4aac-954a-9ef9222209df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e31cf204-c244-4aac-954a-9ef9222209df" (UID: "e31cf204-c244-4aac-954a-9ef9222209df"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.206423 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6wpz\" (UniqueName: \"kubernetes.io/projected/e31cf204-c244-4aac-954a-9ef9222209df-kube-api-access-n6wpz\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.206480 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e31cf204-c244-4aac-954a-9ef9222209df-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.206490 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e31cf204-c244-4aac-954a-9ef9222209df-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.741388 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" event={"ID":"e31cf204-c244-4aac-954a-9ef9222209df","Type":"ContainerDied","Data":"72f1aa406502488d47455a71b32203c22aac449b5c2448f2d31eb724d2f978e0"} Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.741432 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f1aa406502488d47455a71b32203c22aac449b5c2448f2d31eb724d2f978e0" Dec 16 07:00:03 crc kubenswrapper[4789]: I1216 07:00:03.741500 4789 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m" Dec 16 07:00:51 crc kubenswrapper[4789]: I1216 07:00:51.927652 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:00:51 crc kubenswrapper[4789]: I1216 07:00:51.928145 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:01:02 crc kubenswrapper[4789]: I1216 07:01:02.375785 4789 scope.go:117] "RemoveContainer" containerID="dc68cf5edf6a22837f52641014946cbc974bddf7d3d7cffed5287128ea44d580" Dec 16 07:01:02 crc kubenswrapper[4789]: I1216 07:01:02.391514 4789 scope.go:117] "RemoveContainer" containerID="5dad657c25fa22973bb79c7f939a34692d620e7ee61100b5e98b980a968351f0" Dec 16 07:01:02 crc kubenswrapper[4789]: I1216 07:01:02.407526 4789 scope.go:117] "RemoveContainer" containerID="78c0f9d46053800a938b83570d2c3a657a36744d819e6b4c1e464ed6fbd9ec08" Dec 16 07:01:02 crc kubenswrapper[4789]: I1216 07:01:02.425306 4789 scope.go:117] "RemoveContainer" containerID="000b76426f38d969384bb9190479f5c6282b00345280198570f0dc0333fc4a75" Dec 16 07:01:21 crc kubenswrapper[4789]: I1216 07:01:21.928395 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:01:21 crc kubenswrapper[4789]: I1216 
07:01:21.928968 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:01:51 crc kubenswrapper[4789]: I1216 07:01:51.928401 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:01:51 crc kubenswrapper[4789]: I1216 07:01:51.929749 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:01:51 crc kubenswrapper[4789]: I1216 07:01:51.929824 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:01:51 crc kubenswrapper[4789]: I1216 07:01:51.930421 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d23462b9fda7f227b342d36364779a64e2d961db504204feb74ea1aa2b298979"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:01:51 crc kubenswrapper[4789]: I1216 07:01:51.930469 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" 
containerName="machine-config-daemon" containerID="cri-o://d23462b9fda7f227b342d36364779a64e2d961db504204feb74ea1aa2b298979" gracePeriod=600 Dec 16 07:01:52 crc kubenswrapper[4789]: I1216 07:01:52.262543 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="d23462b9fda7f227b342d36364779a64e2d961db504204feb74ea1aa2b298979" exitCode=0 Dec 16 07:01:52 crc kubenswrapper[4789]: I1216 07:01:52.262627 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"d23462b9fda7f227b342d36364779a64e2d961db504204feb74ea1aa2b298979"} Dec 16 07:01:52 crc kubenswrapper[4789]: I1216 07:01:52.262868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"3851c1899da6a57a194edd039ca6372a9930890332c280cff4f36f157e8d3272"} Dec 16 07:01:52 crc kubenswrapper[4789]: I1216 07:01:52.262890 4789 scope.go:117] "RemoveContainer" containerID="2d12421f385572b5f49ec16ce5dc368fcce4c0b47f4845aad6327275ef658245" Dec 16 07:03:02 crc kubenswrapper[4789]: I1216 07:03:02.478024 4789 scope.go:117] "RemoveContainer" containerID="3adc11389a094fb049402fc6d63bd295fd729bf28285b84a7e8e507b240706fd" Dec 16 07:03:46 crc kubenswrapper[4789]: I1216 07:03:46.439619 4789 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 07:04:21 crc kubenswrapper[4789]: I1216 07:04:21.929184 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:04:21 crc 
kubenswrapper[4789]: I1216 07:04:21.929862 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.250646 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbvfm"]
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.258120 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-controller" containerID="cri-o://f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.258169 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="nbdb" containerID="cri-o://f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.258203 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="sbdb" containerID="cri-o://442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.258261 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-node" containerID="cri-o://84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.258130 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.258364 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-acl-logging" containerID="cri-o://446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.258364 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="northd" containerID="cri-o://506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.283441 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller" containerID="cri-o://6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" gracePeriod=30
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.604077 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/3.log"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.606319 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovn-acl-logging/0.log"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.606735 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovn-controller/0.log"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.607196 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656147 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lgncw"]
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656408 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="sbdb"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656427 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="sbdb"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656443 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31cf204-c244-4aac-954a-9ef9222209df" containerName="collect-profiles"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656451 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31cf204-c244-4aac-954a-9ef9222209df" containerName="collect-profiles"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656460 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-node"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656468 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-node"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656481 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656488 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656496 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656503 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656513 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="nbdb"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656520 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="nbdb"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656528 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656537 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656547 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656554 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656565 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kubecfg-setup"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656571 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kubecfg-setup"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656580 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-acl-logging"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656587 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-acl-logging"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656596 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656603 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656611 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="northd"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656618 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="northd"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656627 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-ovn-metrics"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656635 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-ovn-metrics"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656738 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="sbdb"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656752 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656761 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656770 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31cf204-c244-4aac-954a-9ef9222209df" containerName="collect-profiles"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656779 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656785 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-ovn-metrics"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656798 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656806 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656815 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="northd"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656823 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-acl-logging"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656832 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovn-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656842 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="nbdb"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656854 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="kube-rbac-proxy-node"
Dec 16 07:04:31 crc kubenswrapper[4789]: E1216 07:04:31.656982 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.656993 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerName="ovnkube-controller"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.663715 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.690935 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-netns\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691193 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-etc-openvswitch\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691309 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-systemd\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691429 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-config\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691535 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-log-socket\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691628 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-openvswitch\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691735 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-ovn-kubernetes\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691834 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02a3f8b3-6393-4e58-9b49-506f85204b08-ovn-node-metrics-cert\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691959 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-netd\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691070 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692123 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-script-lib\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692236 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-systemd-units\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692266 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-kubelet\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692297 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blqv4\" (UniqueName: \"kubernetes.io/projected/02a3f8b3-6393-4e58-9b49-506f85204b08-kube-api-access-blqv4\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691258 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691581 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-log-socket" (OuterVolumeSpecName: "log-socket") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691717 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691845 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.691835 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692030 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692381 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692328 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692403 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692355 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-var-lib-openvswitch\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692446 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692465 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-ovn\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692485 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-slash\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692498 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-bin\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692514 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-node-log\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692545 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692576 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-env-overrides\") pod \"02a3f8b3-6393-4e58-9b49-506f85204b08\" (UID: \"02a3f8b3-6393-4e58-9b49-506f85204b08\") "
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692547 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-slash" (OuterVolumeSpecName: "host-slash") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692566 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692589 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-node-log" (OuterVolumeSpecName: "node-log") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692626 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692649 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692882 4789 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692894 4789 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-netns\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692902 4789 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692928 4789 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-log-socket\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692936 4789 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692944 4789 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692952 4789 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692960 4789 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-netd\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692968 4789 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692971 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.692976 4789 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-systemd-units\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.693052 4789 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-kubelet\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.693067 4789 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.693078 4789 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-slash\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.693121 4789 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.693131 4789 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-node-log\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.693141 4789 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-host-cni-bin\") on node \"crc\" DevicePath \"\""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.696621 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a3f8b3-6393-4e58-9b49-506f85204b08-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.699237 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a3f8b3-6393-4e58-9b49-506f85204b08-kube-api-access-blqv4" (OuterVolumeSpecName: "kube-api-access-blqv4") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "kube-api-access-blqv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.703735 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "02a3f8b3-6393-4e58-9b49-506f85204b08" (UID: "02a3f8b3-6393-4e58-9b49-506f85204b08"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.794615 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-env-overrides\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795012 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-log-socket\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795096 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-ovn\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795152 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795171 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-cni-netd\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795186 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovnkube-config\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795225 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovn-node-metrics-cert\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795323 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-systemd\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795353 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovnkube-script-lib\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795396 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-slash\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795478 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-systemd-units\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795525 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-kubelet\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795570 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-cni-bin\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795601 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms8kn\" (UniqueName: \"kubernetes.io/projected/22aa29f2-d2f2-49f6-88b2-c4114f424300-kube-api-access-ms8kn\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795621 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-run-ovn-kubernetes\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795637 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795656 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-var-lib-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795670 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-etc-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795688 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-run-netns\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795725 4789 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-node-log\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795789 4789 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02a3f8b3-6393-4e58-9b49-506f85204b08-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795802 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02a3f8b3-6393-4e58-9b49-506f85204b08-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795812 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blqv4\" (UniqueName: \"kubernetes.io/projected/02a3f8b3-6393-4e58-9b49-506f85204b08-kube-api-access-blqv4\") on node \"crc\" DevicePath \"\"" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.795821 4789 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02a3f8b3-6393-4e58-9b49-506f85204b08-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.896930 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-cni-bin\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.896970 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms8kn\" (UniqueName: 
\"kubernetes.io/projected/22aa29f2-d2f2-49f6-88b2-c4114f424300-kube-api-access-ms8kn\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.896989 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-run-ovn-kubernetes\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897008 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897024 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-var-lib-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897037 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-etc-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897053 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-run-netns\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897077 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-node-log\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897093 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-env-overrides\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897107 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-log-socket\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897131 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-ovn\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897145 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-openvswitch\") pod 
\"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897158 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-cni-netd\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897175 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovnkube-config\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897190 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovn-node-metrics-cert\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897212 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-systemd\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897232 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovnkube-script-lib\") pod \"ovnkube-node-lgncw\" (UID: 
\"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897250 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-slash\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897267 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-systemd-units\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897284 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-kubelet\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897375 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-kubelet\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897433 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-cni-bin\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc 
kubenswrapper[4789]: I1216 07:04:31.897472 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-log-socket\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897520 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-slash\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897543 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-systemd-units\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897553 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-etc-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897555 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-var-lib-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897583 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897593 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-run-ovn-kubernetes\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897597 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-run-netns\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897736 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-systemd\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897736 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-host-cni-netd\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897764 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-node-log\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897947 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-openvswitch\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.897978 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/22aa29f2-d2f2-49f6-88b2-c4114f424300-run-ovn\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.898509 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-env-overrides\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.898680 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovnkube-script-lib\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.899593 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovnkube-config\") pod \"ovnkube-node-lgncw\" (UID: 
\"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.902205 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/22aa29f2-d2f2-49f6-88b2-c4114f424300-ovn-node-metrics-cert\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.914340 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms8kn\" (UniqueName: \"kubernetes.io/projected/22aa29f2-d2f2-49f6-88b2-c4114f424300-kube-api-access-ms8kn\") pod \"ovnkube-node-lgncw\" (UID: \"22aa29f2-d2f2-49f6-88b2-c4114f424300\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:31 crc kubenswrapper[4789]: I1216 07:04:31.977072 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.084601 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/2.log" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.085354 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/1.log" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.085396 4789 generic.go:334] "Generic (PLEG): container finished" podID="32431466-a255-4bf2-9237-4f48eab4a71e" containerID="bd434f3a0278709c2668ba4811723fb471cc6af28d94e7295ba888033dbe733f" exitCode=2 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.085424 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" 
event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerDied","Data":"bd434f3a0278709c2668ba4811723fb471cc6af28d94e7295ba888033dbe733f"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.085472 4789 scope.go:117] "RemoveContainer" containerID="9446ebaebd3936c88a498f2cc8ce8b8eced25e626763499102442717fe1e307a" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.085948 4789 scope.go:117] "RemoveContainer" containerID="bd434f3a0278709c2668ba4811723fb471cc6af28d94e7295ba888033dbe733f" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.086421 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"47766f66c8610c460511f6b8040cb4577f492fe6686dc52e3d5906ace60d5fd9"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.088863 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovnkube-controller/3.log" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.092500 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovn-acl-logging/0.log" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093258 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbvfm_02a3f8b3-6393-4e58-9b49-506f85204b08/ovn-controller/0.log" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093623 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" exitCode=0 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093659 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" 
containerID="442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" exitCode=0 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093669 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" exitCode=0 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093679 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" exitCode=0 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093688 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" exitCode=0 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093695 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" exitCode=0 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093703 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" exitCode=143 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093711 4789 generic.go:334] "Generic (PLEG): container finished" podID="02a3f8b3-6393-4e58-9b49-506f85204b08" containerID="f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" exitCode=143 Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093733 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 
07:04:32.093764 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093778 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093790 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093802 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093814 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093827 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093839 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093847 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093855 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093863 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093870 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093878 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093885 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093891 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093897 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093906 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093937 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093946 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093953 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093959 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093966 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093972 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093979 4789 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.093984 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094002 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094010 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094020 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094031 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094039 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094045 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} Dec 16 
07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094052 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094058 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094064 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094071 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094077 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094083 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094090 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094100 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" 
event={"ID":"02a3f8b3-6393-4e58-9b49-506f85204b08","Type":"ContainerDied","Data":"36ff7caac3087898e4430aa6803975a6c8c89772ccd913cca511135dc7bc72ff"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094110 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094118 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094124 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094130 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094136 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094142 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094148 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094155 4789 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094163 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094170 4789 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.094265 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbvfm" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.113963 4789 scope.go:117] "RemoveContainer" containerID="6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.133665 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbvfm"] Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.139674 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pbvfm"] Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.195208 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.213175 4789 scope.go:117] "RemoveContainer" containerID="442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.225868 4789 scope.go:117] "RemoveContainer" containerID="f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.248603 4789 scope.go:117] "RemoveContainer" 
containerID="506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.272563 4789 scope.go:117] "RemoveContainer" containerID="8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.288938 4789 scope.go:117] "RemoveContainer" containerID="84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.302328 4789 scope.go:117] "RemoveContainer" containerID="446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.316857 4789 scope.go:117] "RemoveContainer" containerID="f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.331462 4789 scope.go:117] "RemoveContainer" containerID="1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.344748 4789 scope.go:117] "RemoveContainer" containerID="6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.345182 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": container with ID starting with 6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2 not found: ID does not exist" containerID="6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.345214 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} err="failed to get container status \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": rpc error: code = NotFound desc = could not 
find container \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": container with ID starting with 6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.345241 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.345539 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": container with ID starting with 4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8 not found: ID does not exist" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.345557 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} err="failed to get container status \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": rpc error: code = NotFound desc = could not find container \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": container with ID starting with 4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.345572 4789 scope.go:117] "RemoveContainer" containerID="442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.345880 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": container with ID starting with 442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb not found: ID 
does not exist" containerID="442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.345928 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} err="failed to get container status \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": rpc error: code = NotFound desc = could not find container \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": container with ID starting with 442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.345947 4789 scope.go:117] "RemoveContainer" containerID="f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.346231 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": container with ID starting with f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46 not found: ID does not exist" containerID="f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.346252 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} err="failed to get container status \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": rpc error: code = NotFound desc = could not find container \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": container with ID starting with f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.346265 4789 
scope.go:117] "RemoveContainer" containerID="506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.346519 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": container with ID starting with 506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff not found: ID does not exist" containerID="506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.346549 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} err="failed to get container status \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": rpc error: code = NotFound desc = could not find container \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": container with ID starting with 506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.346570 4789 scope.go:117] "RemoveContainer" containerID="8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.346885 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": container with ID starting with 8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8 not found: ID does not exist" containerID="8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.346924 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} err="failed to get container status \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": rpc error: code = NotFound desc = could not find container \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": container with ID starting with 8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.346945 4789 scope.go:117] "RemoveContainer" containerID="84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.347233 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": container with ID starting with 84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e not found: ID does not exist" containerID="84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.347274 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} err="failed to get container status \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": rpc error: code = NotFound desc = could not find container \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": container with ID starting with 84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.347305 4789 scope.go:117] "RemoveContainer" containerID="446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.347585 4789 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": container with ID starting with 446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b not found: ID does not exist" containerID="446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.347612 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} err="failed to get container status \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": rpc error: code = NotFound desc = could not find container \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": container with ID starting with 446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.347632 4789 scope.go:117] "RemoveContainer" containerID="f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.347881 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": container with ID starting with f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d not found: ID does not exist" containerID="f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.347902 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} err="failed to get container status \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": rpc error: code = NotFound desc = could not find container 
\"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": container with ID starting with f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.347933 4789 scope.go:117] "RemoveContainer" containerID="1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320" Dec 16 07:04:32 crc kubenswrapper[4789]: E1216 07:04:32.348156 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": container with ID starting with 1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320 not found: ID does not exist" containerID="1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.348181 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} err="failed to get container status \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": rpc error: code = NotFound desc = could not find container \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": container with ID starting with 1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.348196 4789 scope.go:117] "RemoveContainer" containerID="6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.348406 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} err="failed to get container status \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": rpc error: code = NotFound desc = could not find 
container \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": container with ID starting with 6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.348426 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.348810 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} err="failed to get container status \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": rpc error: code = NotFound desc = could not find container \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": container with ID starting with 4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.348826 4789 scope.go:117] "RemoveContainer" containerID="442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349098 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} err="failed to get container status \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": rpc error: code = NotFound desc = could not find container \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": container with ID starting with 442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349111 4789 scope.go:117] "RemoveContainer" containerID="f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349318 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} err="failed to get container status \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": rpc error: code = NotFound desc = could not find container \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": container with ID starting with f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349346 4789 scope.go:117] "RemoveContainer" containerID="506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349534 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} err="failed to get container status \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": rpc error: code = NotFound desc = could not find container \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": container with ID starting with 506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349560 4789 scope.go:117] "RemoveContainer" containerID="8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349749 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} err="failed to get container status \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": rpc error: code = NotFound desc = could not find container \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": container with ID starting with 
8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.349773 4789 scope.go:117] "RemoveContainer" containerID="84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.350027 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} err="failed to get container status \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": rpc error: code = NotFound desc = could not find container \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": container with ID starting with 84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.350055 4789 scope.go:117] "RemoveContainer" containerID="446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.352955 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} err="failed to get container status \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": rpc error: code = NotFound desc = could not find container \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": container with ID starting with 446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.352985 4789 scope.go:117] "RemoveContainer" containerID="f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.353321 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} err="failed to get container status \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": rpc error: code = NotFound desc = could not find container \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": container with ID starting with f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.353350 4789 scope.go:117] "RemoveContainer" containerID="1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.353673 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} err="failed to get container status \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": rpc error: code = NotFound desc = could not find container \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": container with ID starting with 1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.353702 4789 scope.go:117] "RemoveContainer" containerID="6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.353938 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} err="failed to get container status \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": rpc error: code = NotFound desc = could not find container \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": container with ID starting with 6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2 not found: ID does not 
exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.353958 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.354166 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} err="failed to get container status \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": rpc error: code = NotFound desc = could not find container \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": container with ID starting with 4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.354189 4789 scope.go:117] "RemoveContainer" containerID="442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.354439 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} err="failed to get container status \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": rpc error: code = NotFound desc = could not find container \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": container with ID starting with 442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.354467 4789 scope.go:117] "RemoveContainer" containerID="f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.354894 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} err="failed to get container status 
\"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": rpc error: code = NotFound desc = could not find container \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": container with ID starting with f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.354960 4789 scope.go:117] "RemoveContainer" containerID="506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.355524 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} err="failed to get container status \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": rpc error: code = NotFound desc = could not find container \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": container with ID starting with 506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.355548 4789 scope.go:117] "RemoveContainer" containerID="8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.355871 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} err="failed to get container status \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": rpc error: code = NotFound desc = could not find container \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": container with ID starting with 8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.355901 4789 scope.go:117] "RemoveContainer" 
containerID="84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.356193 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} err="failed to get container status \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": rpc error: code = NotFound desc = could not find container \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": container with ID starting with 84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.356219 4789 scope.go:117] "RemoveContainer" containerID="446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.356383 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} err="failed to get container status \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": rpc error: code = NotFound desc = could not find container \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": container with ID starting with 446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.356406 4789 scope.go:117] "RemoveContainer" containerID="f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.356601 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} err="failed to get container status \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": rpc error: code = NotFound desc = could 
not find container \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": container with ID starting with f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.356626 4789 scope.go:117] "RemoveContainer" containerID="1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.357030 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} err="failed to get container status \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": rpc error: code = NotFound desc = could not find container \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": container with ID starting with 1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.357053 4789 scope.go:117] "RemoveContainer" containerID="6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.358004 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2"} err="failed to get container status \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": rpc error: code = NotFound desc = could not find container \"6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2\": container with ID starting with 6eb7f114e59426486860858942501bac0332134037e85ade00cb0b1d399dd4b2 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.358046 4789 scope.go:117] "RemoveContainer" containerID="4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 
07:04:32.358356 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8"} err="failed to get container status \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": rpc error: code = NotFound desc = could not find container \"4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8\": container with ID starting with 4a0c24b0de17fc94674f2bc82600529cf3993e75cbc1033508dc5cd3e5f5bac8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.358376 4789 scope.go:117] "RemoveContainer" containerID="442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.358948 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb"} err="failed to get container status \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": rpc error: code = NotFound desc = could not find container \"442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb\": container with ID starting with 442b712677e91c3aa1367d36d93bd65fd988f860d7a98bffafa20d320e8c2bfb not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.358969 4789 scope.go:117] "RemoveContainer" containerID="f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.359284 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46"} err="failed to get container status \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": rpc error: code = NotFound desc = could not find container \"f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46\": container with ID starting with 
f8bcec82341d9dce8e98875af03873f0ed9455dc1492d694b76623b020339d46 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.359312 4789 scope.go:117] "RemoveContainer" containerID="506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.359603 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff"} err="failed to get container status \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": rpc error: code = NotFound desc = could not find container \"506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff\": container with ID starting with 506313c7f52b54da7e4d51ea6f2cbe4048d5d7b5424c3113f734e5be747668ff not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.359630 4789 scope.go:117] "RemoveContainer" containerID="8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.359921 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8"} err="failed to get container status \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": rpc error: code = NotFound desc = could not find container \"8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8\": container with ID starting with 8cbcc3d2ea346f35ebca3675c7f2637b2ff9c60e5250c65fe3583772178e59a8 not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.359948 4789 scope.go:117] "RemoveContainer" containerID="84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.360250 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e"} err="failed to get container status \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": rpc error: code = NotFound desc = could not find container \"84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e\": container with ID starting with 84c3ba0ee34070dddfd6b12d0899ca3143ca972fc9e765d8dc2fffe335fe365e not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.360271 4789 scope.go:117] "RemoveContainer" containerID="446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.360471 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b"} err="failed to get container status \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": rpc error: code = NotFound desc = could not find container \"446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b\": container with ID starting with 446152e6fca3a2b8fab42ced7403f09ef14ed39575c5a32acfe10f9af7fda26b not found: ID does not exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.360491 4789 scope.go:117] "RemoveContainer" containerID="f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.360844 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d"} err="failed to get container status \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": rpc error: code = NotFound desc = could not find container \"f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d\": container with ID starting with f5ebf21db2a0bdfdeb3750487482de903555a77faf362eff5090dea0cb1dba1d not found: ID does not 
exist" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.360870 4789 scope.go:117] "RemoveContainer" containerID="1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320" Dec 16 07:04:32 crc kubenswrapper[4789]: I1216 07:04:32.361135 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320"} err="failed to get container status \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": rpc error: code = NotFound desc = could not find container \"1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320\": container with ID starting with 1031b524c3298f7d528274ecf015fdc2786eaf8c29391d8a343c2c72e9667320 not found: ID does not exist" Dec 16 07:04:33 crc kubenswrapper[4789]: I1216 07:04:33.101720 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58dsj_32431466-a255-4bf2-9237-4f48eab4a71e/kube-multus/2.log" Dec 16 07:04:33 crc kubenswrapper[4789]: I1216 07:04:33.101992 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58dsj" event={"ID":"32431466-a255-4bf2-9237-4f48eab4a71e","Type":"ContainerStarted","Data":"920a5a4b659945f8bcd6a7ba74729a87c6c8c819b1f5ecfa75a444852b953382"} Dec 16 07:04:33 crc kubenswrapper[4789]: I1216 07:04:33.103683 4789 generic.go:334] "Generic (PLEG): container finished" podID="22aa29f2-d2f2-49f6-88b2-c4114f424300" containerID="755e3dd4b3da288fa75374331c2c7faaeed352c17de553159d85780213d0ad10" exitCode=0 Dec 16 07:04:33 crc kubenswrapper[4789]: I1216 07:04:33.103723 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerDied","Data":"755e3dd4b3da288fa75374331c2c7faaeed352c17de553159d85780213d0ad10"} Dec 16 07:04:34 crc kubenswrapper[4789]: I1216 07:04:34.114857 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="02a3f8b3-6393-4e58-9b49-506f85204b08" path="/var/lib/kubelet/pods/02a3f8b3-6393-4e58-9b49-506f85204b08/volumes" Dec 16 07:04:34 crc kubenswrapper[4789]: I1216 07:04:34.116511 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"1efa8af4ffae0fb79ee49025ec7197ce885cea61c3ebd05b7ee5114087c30f51"} Dec 16 07:04:34 crc kubenswrapper[4789]: I1216 07:04:34.116540 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"8f6a0533c13529e3abf12ed540ae07c8357d8658664d534e1a426111684021e9"} Dec 16 07:04:34 crc kubenswrapper[4789]: I1216 07:04:34.116553 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"7785f1bba5ab97506e165827ee96653ec33f685619aa6a4a29143b312239b876"} Dec 16 07:04:34 crc kubenswrapper[4789]: I1216 07:04:34.116562 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"488cc2fa8ffa6cb78ac1c052387ae07e06446b37c0c087b6ef71d7435267fe93"} Dec 16 07:04:34 crc kubenswrapper[4789]: I1216 07:04:34.116571 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"48c75b15cfe954b89314ed079c9863153fd9ab87504be347e5b67bd9d9d69f84"} Dec 16 07:04:34 crc kubenswrapper[4789]: I1216 07:04:34.116581 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" 
event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"61a99923bd14f5abd65657740b7d36fb1fe81386ce3b4563bcd9ccba1e36b617"} Dec 16 07:04:36 crc kubenswrapper[4789]: I1216 07:04:36.126848 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"73aed9de44855cc1701326261c42c025db50061b2070041f20b6bad22fd3dca8"} Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.617259 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-f9mt2"] Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.618153 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.619816 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.619976 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.620018 4789 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wmbjf" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.620198 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.658191 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-node-mnt\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.658254 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzkz\" (UniqueName: \"kubernetes.io/projected/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-kube-api-access-6bzkz\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.658296 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-crc-storage\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.759453 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-crc-storage\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.759560 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-node-mnt\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.759615 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzkz\" (UniqueName: \"kubernetes.io/projected/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-kube-api-access-6bzkz\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.759883 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-node-mnt\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.760125 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-crc-storage\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.781622 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzkz\" (UniqueName: \"kubernetes.io/projected/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-kube-api-access-6bzkz\") pod \"crc-storage-crc-f9mt2\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: I1216 07:04:37.933330 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: E1216 07:04:37.955625 4789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(729b7b5ebd461e73b0d0f297fbe174e6ddccca7d2ada3fe80707490f3a7d5198): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 07:04:37 crc kubenswrapper[4789]: E1216 07:04:37.956549 4789 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(729b7b5ebd461e73b0d0f297fbe174e6ddccca7d2ada3fe80707490f3a7d5198): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: E1216 07:04:37.956586 4789 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(729b7b5ebd461e73b0d0f297fbe174e6ddccca7d2ada3fe80707490f3a7d5198): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:37 crc kubenswrapper[4789]: E1216 07:04:37.956662 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-f9mt2_crc-storage(6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-f9mt2_crc-storage(6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(729b7b5ebd461e73b0d0f297fbe174e6ddccca7d2ada3fe80707490f3a7d5198): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-f9mt2" podUID="6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" Dec 16 07:04:39 crc kubenswrapper[4789]: I1216 07:04:39.142609 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" event={"ID":"22aa29f2-d2f2-49f6-88b2-c4114f424300","Type":"ContainerStarted","Data":"e141691bc4a32169c0f1e48e36ec5af510337ddaa62e33a89262fbf5413ff371"} Dec 16 07:04:39 crc kubenswrapper[4789]: I1216 07:04:39.675143 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-f9mt2"] Dec 16 07:04:39 crc kubenswrapper[4789]: I1216 07:04:39.675255 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:39 crc kubenswrapper[4789]: I1216 07:04:39.675624 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:39 crc kubenswrapper[4789]: E1216 07:04:39.708945 4789 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(fdec7957383e7e3c31ee9776aa869dfa82ba20dfea5da02a746288e27812103b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 07:04:39 crc kubenswrapper[4789]: E1216 07:04:39.709413 4789 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(fdec7957383e7e3c31ee9776aa869dfa82ba20dfea5da02a746288e27812103b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:39 crc kubenswrapper[4789]: E1216 07:04:39.709468 4789 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(fdec7957383e7e3c31ee9776aa869dfa82ba20dfea5da02a746288e27812103b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:39 crc kubenswrapper[4789]: E1216 07:04:39.709551 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-f9mt2_crc-storage(6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-f9mt2_crc-storage(6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-f9mt2_crc-storage_6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889_0(fdec7957383e7e3c31ee9776aa869dfa82ba20dfea5da02a746288e27812103b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-f9mt2" podUID="6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" Dec 16 07:04:40 crc kubenswrapper[4789]: I1216 07:04:40.147460 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:40 crc kubenswrapper[4789]: I1216 07:04:40.147509 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:40 crc kubenswrapper[4789]: I1216 07:04:40.147518 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:40 crc kubenswrapper[4789]: I1216 07:04:40.177287 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:40 crc kubenswrapper[4789]: I1216 07:04:40.179633 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" podStartSLOduration=9.179611459 podStartE2EDuration="9.179611459s" podCreationTimestamp="2025-12-16 07:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
07:04:40.176806 +0000 UTC m=+818.438693649" watchObservedRunningTime="2025-12-16 07:04:40.179611459 +0000 UTC m=+818.441499088" Dec 16 07:04:40 crc kubenswrapper[4789]: I1216 07:04:40.188295 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw" Dec 16 07:04:51 crc kubenswrapper[4789]: I1216 07:04:51.928432 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:04:51 crc kubenswrapper[4789]: I1216 07:04:51.929045 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:04:54 crc kubenswrapper[4789]: I1216 07:04:54.104262 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:54 crc kubenswrapper[4789]: I1216 07:04:54.105554 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:54 crc kubenswrapper[4789]: I1216 07:04:54.520961 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-f9mt2"] Dec 16 07:04:54 crc kubenswrapper[4789]: I1216 07:04:54.547153 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:04:55 crc kubenswrapper[4789]: I1216 07:04:55.234125 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f9mt2" event={"ID":"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889","Type":"ContainerStarted","Data":"2ff842efea1441bdc98fb3faddcc00593131266486e8172434875c0cc31677ea"} Dec 16 07:04:56 crc kubenswrapper[4789]: I1216 07:04:56.244993 4789 generic.go:334] "Generic (PLEG): container finished" podID="6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" containerID="4671e6c07c62e8f147205567e01189d3299a3e899d46fe76d7ac1f98bcf71f74" exitCode=0 Dec 16 07:04:56 crc kubenswrapper[4789]: I1216 07:04:56.245086 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f9mt2" event={"ID":"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889","Type":"ContainerDied","Data":"4671e6c07c62e8f147205567e01189d3299a3e899d46fe76d7ac1f98bcf71f74"} Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.529027 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2" Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.625507 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bzkz\" (UniqueName: \"kubernetes.io/projected/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-kube-api-access-6bzkz\") pod \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.625566 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-node-mnt\") pod \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.625648 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-crc-storage\") pod \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\" (UID: \"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889\") " Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.625749 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" (UID: "6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.626317 4789 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.629730 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-kube-api-access-6bzkz" (OuterVolumeSpecName: "kube-api-access-6bzkz") pod "6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" (UID: "6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889"). InnerVolumeSpecName "kube-api-access-6bzkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.637701 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" (UID: "6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.727347 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bzkz\" (UniqueName: \"kubernetes.io/projected/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-kube-api-access-6bzkz\") on node \"crc\" DevicePath \"\"" Dec 16 07:04:57 crc kubenswrapper[4789]: I1216 07:04:57.727394 4789 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 16 07:04:58 crc kubenswrapper[4789]: I1216 07:04:58.255519 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-f9mt2" event={"ID":"6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889","Type":"ContainerDied","Data":"2ff842efea1441bdc98fb3faddcc00593131266486e8172434875c0cc31677ea"} Dec 16 07:04:58 crc kubenswrapper[4789]: I1216 07:04:58.255558 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ff842efea1441bdc98fb3faddcc00593131266486e8172434875c0cc31677ea" Dec 16 07:04:58 crc kubenswrapper[4789]: I1216 07:04:58.255580 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-f9mt2"
Dec 16 07:05:02 crc kubenswrapper[4789]: I1216 07:05:02.009285 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lgncw"
Dec 16 07:05:04 crc kubenswrapper[4789]: I1216 07:05:04.896601 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"]
Dec 16 07:05:04 crc kubenswrapper[4789]: E1216 07:05:04.897329 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" containerName="storage"
Dec 16 07:05:04 crc kubenswrapper[4789]: I1216 07:05:04.897343 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" containerName="storage"
Dec 16 07:05:04 crc kubenswrapper[4789]: I1216 07:05:04.897440 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" containerName="storage"
Dec 16 07:05:04 crc kubenswrapper[4789]: I1216 07:05:04.898122 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:04 crc kubenswrapper[4789]: I1216 07:05:04.899945 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 16 07:05:04 crc kubenswrapper[4789]: I1216 07:05:04.909803 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"]
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.023889 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.024027 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhxgh\" (UniqueName: \"kubernetes.io/projected/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-kube-api-access-qhxgh\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.024071 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.125138 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhxgh\" (UniqueName: \"kubernetes.io/projected/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-kube-api-access-qhxgh\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.125177 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.125249 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.125617 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.126310 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.144308 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhxgh\" (UniqueName: \"kubernetes.io/projected/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-kube-api-access-qhxgh\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.211890 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:05 crc kubenswrapper[4789]: I1216 07:05:05.389900 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"]
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.299845 4789 generic.go:334] "Generic (PLEG): container finished" podID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerID="4f3a3373fce1114fc5e276fe8b7a66c6617e0a2482eb549baf542c7089b9e27d" exitCode=0
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.299883 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64" event={"ID":"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127","Type":"ContainerDied","Data":"4f3a3373fce1114fc5e276fe8b7a66c6617e0a2482eb549baf542c7089b9e27d"}
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.299926 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64" event={"ID":"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127","Type":"ContainerStarted","Data":"8c4bc93560d59ad50e828d404d053650c19dbdc6e8fa391573de62d568d3dd7c"}
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.851775 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7ffk"]
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.853215 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.864367 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7ffk"]
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.944934 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbfmk\" (UniqueName: \"kubernetes.io/projected/33253be4-48c8-45fa-94d2-1f58434c8a7f-kube-api-access-kbfmk\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.945008 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-catalog-content\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:06 crc kubenswrapper[4789]: I1216 07:05:06.945032 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-utilities\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.046489 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbfmk\" (UniqueName: \"kubernetes.io/projected/33253be4-48c8-45fa-94d2-1f58434c8a7f-kube-api-access-kbfmk\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.046565 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-catalog-content\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.046591 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-utilities\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.047061 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-utilities\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.047571 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-catalog-content\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.067993 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbfmk\" (UniqueName: \"kubernetes.io/projected/33253be4-48c8-45fa-94d2-1f58434c8a7f-kube-api-access-kbfmk\") pod \"redhat-operators-g7ffk\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.169588 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:07 crc kubenswrapper[4789]: I1216 07:05:07.376017 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7ffk"]
Dec 16 07:05:07 crc kubenswrapper[4789]: W1216 07:05:07.382452 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33253be4_48c8_45fa_94d2_1f58434c8a7f.slice/crio-07cdd563a880bc2edb24b6c662fffea29e9cc2626c25bd59dff91a2eac6d152c WatchSource:0}: Error finding container 07cdd563a880bc2edb24b6c662fffea29e9cc2626c25bd59dff91a2eac6d152c: Status 404 returned error can't find the container with id 07cdd563a880bc2edb24b6c662fffea29e9cc2626c25bd59dff91a2eac6d152c
Dec 16 07:05:08 crc kubenswrapper[4789]: I1216 07:05:08.314181 4789 generic.go:334] "Generic (PLEG): container finished" podID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerID="6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1" exitCode=0
Dec 16 07:05:08 crc kubenswrapper[4789]: I1216 07:05:08.314223 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ffk" event={"ID":"33253be4-48c8-45fa-94d2-1f58434c8a7f","Type":"ContainerDied","Data":"6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1"}
Dec 16 07:05:08 crc kubenswrapper[4789]: I1216 07:05:08.314736 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ffk" event={"ID":"33253be4-48c8-45fa-94d2-1f58434c8a7f","Type":"ContainerStarted","Data":"07cdd563a880bc2edb24b6c662fffea29e9cc2626c25bd59dff91a2eac6d152c"}
Dec 16 07:05:08 crc kubenswrapper[4789]: I1216 07:05:08.318133 4789 generic.go:334] "Generic (PLEG): container finished" podID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerID="3dea42c9fc702aaa5fbfbc780dcd2ed4164529ad7b48837ec1f60895523cadab" exitCode=0
Dec 16 07:05:08 crc kubenswrapper[4789]: I1216 07:05:08.318203 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64" event={"ID":"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127","Type":"ContainerDied","Data":"3dea42c9fc702aaa5fbfbc780dcd2ed4164529ad7b48837ec1f60895523cadab"}
Dec 16 07:05:09 crc kubenswrapper[4789]: I1216 07:05:09.327354 4789 generic.go:334] "Generic (PLEG): container finished" podID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerID="d339b3a0663819480cf545a65fc5d3eacda7722f39db14f05e9a9cbe04f34ad7" exitCode=0
Dec 16 07:05:09 crc kubenswrapper[4789]: I1216 07:05:09.327402 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64" event={"ID":"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127","Type":"ContainerDied","Data":"d339b3a0663819480cf545a65fc5d3eacda7722f39db14f05e9a9cbe04f34ad7"}
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.335631 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ffk" event={"ID":"33253be4-48c8-45fa-94d2-1f58434c8a7f","Type":"ContainerStarted","Data":"e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832"}
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.570997 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.584265 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhxgh\" (UniqueName: \"kubernetes.io/projected/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-kube-api-access-qhxgh\") pod \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") "
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.584300 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-util\") pod \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") "
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.584326 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-bundle\") pod \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\" (UID: \"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127\") "
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.585089 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-bundle" (OuterVolumeSpecName: "bundle") pod "1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" (UID: "1dfc67d5-cfb7-4210-8d0a-0b1e87e77127"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.590063 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-kube-api-access-qhxgh" (OuterVolumeSpecName: "kube-api-access-qhxgh") pod "1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" (UID: "1dfc67d5-cfb7-4210-8d0a-0b1e87e77127"). InnerVolumeSpecName "kube-api-access-qhxgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.602502 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-util" (OuterVolumeSpecName: "util") pod "1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" (UID: "1dfc67d5-cfb7-4210-8d0a-0b1e87e77127"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.686234 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhxgh\" (UniqueName: \"kubernetes.io/projected/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-kube-api-access-qhxgh\") on node \"crc\" DevicePath \"\""
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.686262 4789 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-util\") on node \"crc\" DevicePath \"\""
Dec 16 07:05:10 crc kubenswrapper[4789]: I1216 07:05:10.686271 4789 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1dfc67d5-cfb7-4210-8d0a-0b1e87e77127-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:05:11 crc kubenswrapper[4789]: I1216 07:05:11.342404 4789 generic.go:334] "Generic (PLEG): container finished" podID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerID="e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832" exitCode=0
Dec 16 07:05:11 crc kubenswrapper[4789]: I1216 07:05:11.342652 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ffk" event={"ID":"33253be4-48c8-45fa-94d2-1f58434c8a7f","Type":"ContainerDied","Data":"e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832"}
Dec 16 07:05:11 crc kubenswrapper[4789]: I1216 07:05:11.348357 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64" event={"ID":"1dfc67d5-cfb7-4210-8d0a-0b1e87e77127","Type":"ContainerDied","Data":"8c4bc93560d59ad50e828d404d053650c19dbdc6e8fa391573de62d568d3dd7c"}
Dec 16 07:05:11 crc kubenswrapper[4789]: I1216 07:05:11.348405 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c4bc93560d59ad50e828d404d053650c19dbdc6e8fa391573de62d568d3dd7c"
Dec 16 07:05:11 crc kubenswrapper[4789]: I1216 07:05:11.348571 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.355311 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ffk" event={"ID":"33253be4-48c8-45fa-94d2-1f58434c8a7f","Type":"ContainerStarted","Data":"e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df"}
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.378253 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7ffk" podStartSLOduration=2.922033828 podStartE2EDuration="6.378237198s" podCreationTimestamp="2025-12-16 07:05:06 +0000 UTC" firstStartedPulling="2025-12-16 07:05:08.315968347 +0000 UTC m=+846.577855976" lastFinishedPulling="2025-12-16 07:05:11.772171727 +0000 UTC m=+850.034059346" observedRunningTime="2025-12-16 07:05:12.376815652 +0000 UTC m=+850.638703291" watchObservedRunningTime="2025-12-16 07:05:12.378237198 +0000 UTC m=+850.640124817"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.966790 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-cvp25"]
Dec 16 07:05:12 crc kubenswrapper[4789]: E1216 07:05:12.967321 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerName="util"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.967335 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerName="util"
Dec 16 07:05:12 crc kubenswrapper[4789]: E1216 07:05:12.967354 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerName="extract"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.967360 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerName="extract"
Dec 16 07:05:12 crc kubenswrapper[4789]: E1216 07:05:12.967373 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerName="pull"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.967379 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerName="pull"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.967471 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dfc67d5-cfb7-4210-8d0a-0b1e87e77127" containerName="extract"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.967823 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.972517 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.973018 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-sjz5g"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.973078 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 16 07:05:12 crc kubenswrapper[4789]: I1216 07:05:12.984049 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-cvp25"]
Dec 16 07:05:13 crc kubenswrapper[4789]: I1216 07:05:13.014548 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6rlw\" (UniqueName: \"kubernetes.io/projected/31c10535-b6da-4119-a311-1065b2bcb324-kube-api-access-k6rlw\") pod \"nmstate-operator-6769fb99d-cvp25\" (UID: \"31c10535-b6da-4119-a311-1065b2bcb324\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25"
Dec 16 07:05:13 crc kubenswrapper[4789]: I1216 07:05:13.115837 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6rlw\" (UniqueName: \"kubernetes.io/projected/31c10535-b6da-4119-a311-1065b2bcb324-kube-api-access-k6rlw\") pod \"nmstate-operator-6769fb99d-cvp25\" (UID: \"31c10535-b6da-4119-a311-1065b2bcb324\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25"
Dec 16 07:05:13 crc kubenswrapper[4789]: I1216 07:05:13.145941 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6rlw\" (UniqueName: \"kubernetes.io/projected/31c10535-b6da-4119-a311-1065b2bcb324-kube-api-access-k6rlw\") pod \"nmstate-operator-6769fb99d-cvp25\" (UID: \"31c10535-b6da-4119-a311-1065b2bcb324\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25"
Dec 16 07:05:13 crc kubenswrapper[4789]: I1216 07:05:13.282874 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25"
Dec 16 07:05:13 crc kubenswrapper[4789]: I1216 07:05:13.520316 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-cvp25"]
Dec 16 07:05:14 crc kubenswrapper[4789]: I1216 07:05:14.374759 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25" event={"ID":"31c10535-b6da-4119-a311-1065b2bcb324","Type":"ContainerStarted","Data":"907ae84c9a32519f37f4217fc8200dcbe8c2f1246a455beac0307e5b4ce4c58e"}
Dec 16 07:05:16 crc kubenswrapper[4789]: I1216 07:05:16.385509 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25" event={"ID":"31c10535-b6da-4119-a311-1065b2bcb324","Type":"ContainerStarted","Data":"6cbaa301100f5555f110f56dd7838111951ec23e08d6748c3260fd71502c73a8"}
Dec 16 07:05:16 crc kubenswrapper[4789]: I1216 07:05:16.403556 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-cvp25" podStartSLOduration=1.8740892329999999 podStartE2EDuration="4.403536119s" podCreationTimestamp="2025-12-16 07:05:12 +0000 UTC" firstStartedPulling="2025-12-16 07:05:13.528137153 +0000 UTC m=+851.790024782" lastFinishedPulling="2025-12-16 07:05:16.057584039 +0000 UTC m=+854.319471668" observedRunningTime="2025-12-16 07:05:16.400994474 +0000 UTC m=+854.662882103" watchObservedRunningTime="2025-12-16 07:05:16.403536119 +0000 UTC m=+854.665423748"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.170001 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.171877 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g7ffk"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.429077 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7"]
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.430366 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.432629 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-97fdf"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.438818 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"]
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.439516 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.441562 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.444514 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7"]
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.466822 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bfwks"]
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.467542 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.467876 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d3e603c6-bac1-496b-bf35-1e8124144121-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-pl68l\" (UID: \"d3e603c6-bac1-496b-bf35-1e8124144121\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.468102 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z56wt\" (UniqueName: \"kubernetes.io/projected/ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28-kube-api-access-z56wt\") pod \"nmstate-metrics-7f7f7578db-t4fl7\" (UID: \"ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.468884 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwz2v\" (UniqueName: \"kubernetes.io/projected/d3e603c6-bac1-496b-bf35-1e8124144121-kube-api-access-rwz2v\") pod \"nmstate-webhook-f8fb84555-pl68l\" (UID: \"d3e603c6-bac1-496b-bf35-1e8124144121\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.481415 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"]
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.569440 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj"]
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.569621 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-ovs-socket\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.569674 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d3e603c6-bac1-496b-bf35-1e8124144121-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-pl68l\" (UID: \"d3e603c6-bac1-496b-bf35-1e8124144121\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.569712 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z56wt\" (UniqueName: \"kubernetes.io/projected/ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28-kube-api-access-z56wt\") pod \"nmstate-metrics-7f7f7578db-t4fl7\" (UID: \"ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.569750 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-dbus-socket\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.569776 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmml\" (UniqueName: \"kubernetes.io/projected/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-kube-api-access-qbmml\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.569793 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-nmstate-lock\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: E1216 07:05:17.569964 4789 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Dec 16 07:05:17 crc kubenswrapper[4789]: E1216 07:05:17.570044 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e603c6-bac1-496b-bf35-1e8124144121-tls-key-pair podName:d3e603c6-bac1-496b-bf35-1e8124144121 nodeName:}" failed. No retries permitted until 2025-12-16 07:05:18.070025631 +0000 UTC m=+856.331913260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/d3e603c6-bac1-496b-bf35-1e8124144121-tls-key-pair") pod "nmstate-webhook-f8fb84555-pl68l" (UID: "d3e603c6-bac1-496b-bf35-1e8124144121") : secret "openshift-nmstate-webhook" not found
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.570070 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwz2v\" (UniqueName: \"kubernetes.io/projected/d3e603c6-bac1-496b-bf35-1e8124144121-kube-api-access-rwz2v\") pod \"nmstate-webhook-f8fb84555-pl68l\" (UID: \"d3e603c6-bac1-496b-bf35-1e8124144121\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.570251 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.572097 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.572461 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.572824 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-st875"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.586704 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj"]
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.595731 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwz2v\" (UniqueName: \"kubernetes.io/projected/d3e603c6-bac1-496b-bf35-1e8124144121-kube-api-access-rwz2v\") pod \"nmstate-webhook-f8fb84555-pl68l\" (UID: \"d3e603c6-bac1-496b-bf35-1e8124144121\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.598938 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z56wt\" (UniqueName: \"kubernetes.io/projected/ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28-kube-api-access-z56wt\") pod \"nmstate-metrics-7f7f7578db-t4fl7\" (UID: \"ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671072 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-ovs-socket\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671157 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-dbus-socket\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671168 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-ovs-socket\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671183 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-nmstate-lock\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671234 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-nmstate-lock\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671259 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmml\" (UniqueName: \"kubernetes.io/projected/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-kube-api-access-qbmml\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks"
Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671434 4789 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-dbus-socket\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671571 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4xz\" (UniqueName: \"kubernetes.io/projected/312b3314-cd6b-422b-910e-9fdf5df3d594-kube-api-access-kv4xz\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671630 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/312b3314-cd6b-422b-910e-9fdf5df3d594-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.671696 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/312b3314-cd6b-422b-910e-9fdf5df3d594-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.698598 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmml\" (UniqueName: \"kubernetes.io/projected/2febe2b7-c4da-4fca-bb73-6e4e3bc19c36-kube-api-access-qbmml\") pod \"nmstate-handler-bfwks\" (UID: \"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36\") " pod="openshift-nmstate/nmstate-handler-bfwks" Dec 16 07:05:17 crc 
kubenswrapper[4789]: I1216 07:05:17.738319 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d7d67469c-bcmqz"] Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.739149 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.747841 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.752823 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d7d67469c-bcmqz"] Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773666 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8k4b\" (UniqueName: \"kubernetes.io/projected/518f8482-fa37-47b9-9a5f-4c80894ce2cd-kube-api-access-n8k4b\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773710 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-oauth-config\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773729 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-oauth-serving-cert\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 
07:05:17.773825 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-serving-cert\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773842 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-service-ca\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773868 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-config\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773894 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4xz\" (UniqueName: \"kubernetes.io/projected/312b3314-cd6b-422b-910e-9fdf5df3d594-kube-api-access-kv4xz\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773942 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-trusted-ca-bundle\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " 
pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.773967 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/312b3314-cd6b-422b-910e-9fdf5df3d594-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.774016 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/312b3314-cd6b-422b-910e-9fdf5df3d594-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: E1216 07:05:17.774155 4789 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 16 07:05:17 crc kubenswrapper[4789]: E1216 07:05:17.774209 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/312b3314-cd6b-422b-910e-9fdf5df3d594-plugin-serving-cert podName:312b3314-cd6b-422b-910e-9fdf5df3d594 nodeName:}" failed. No retries permitted until 2025-12-16 07:05:18.274190055 +0000 UTC m=+856.536077684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/312b3314-cd6b-422b-910e-9fdf5df3d594-plugin-serving-cert") pod "nmstate-console-plugin-6ff7998486-2gnhj" (UID: "312b3314-cd6b-422b-910e-9fdf5df3d594") : secret "plugin-serving-cert" not found Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.775567 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/312b3314-cd6b-422b-910e-9fdf5df3d594-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.785086 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bfwks" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.792275 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4xz\" (UniqueName: \"kubernetes.io/projected/312b3314-cd6b-422b-910e-9fdf5df3d594-kube-api-access-kv4xz\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:17 crc kubenswrapper[4789]: W1216 07:05:17.819007 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2febe2b7_c4da_4fca_bb73_6e4e3bc19c36.slice/crio-2d6b791d649a9be674586ae0577802a96b29c49958e96bf2be97d0cc2b4f0a21 WatchSource:0}: Error finding container 2d6b791d649a9be674586ae0577802a96b29c49958e96bf2be97d0cc2b4f0a21: Status 404 returned error can't find the container with id 2d6b791d649a9be674586ae0577802a96b29c49958e96bf2be97d0cc2b4f0a21 Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.874357 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-serving-cert\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.874386 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-service-ca\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.874402 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-config\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.874425 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-trusted-ca-bundle\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.874483 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8k4b\" (UniqueName: \"kubernetes.io/projected/518f8482-fa37-47b9-9a5f-4c80894ce2cd-kube-api-access-n8k4b\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.874502 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-oauth-config\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.874520 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-oauth-serving-cert\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.875243 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-service-ca\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.876146 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-oauth-serving-cert\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.877100 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-trusted-ca-bundle\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.878156 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-config\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.880032 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-serving-cert\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.880533 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/518f8482-fa37-47b9-9a5f-4c80894ce2cd-console-oauth-config\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.891428 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8k4b\" (UniqueName: \"kubernetes.io/projected/518f8482-fa37-47b9-9a5f-4c80894ce2cd-kube-api-access-n8k4b\") pod \"console-6d7d67469c-bcmqz\" (UID: \"518f8482-fa37-47b9-9a5f-4c80894ce2cd\") " pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:17 crc kubenswrapper[4789]: I1216 07:05:17.984447 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7"] Dec 16 07:05:17 crc kubenswrapper[4789]: W1216 07:05:17.985428 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff37692e_6be8_4ebb_b3fe_1a58fcb4ac28.slice/crio-9390a53c522d701bbd2fa778dc7ef5d0808a0102a8f13ef0e409297f08e52b82 WatchSource:0}: Error finding container 9390a53c522d701bbd2fa778dc7ef5d0808a0102a8f13ef0e409297f08e52b82: Status 404 returned 
error can't find the container with id 9390a53c522d701bbd2fa778dc7ef5d0808a0102a8f13ef0e409297f08e52b82 Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.056802 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.076620 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d3e603c6-bac1-496b-bf35-1e8124144121-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-pl68l\" (UID: \"d3e603c6-bac1-496b-bf35-1e8124144121\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.088024 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d3e603c6-bac1-496b-bf35-1e8124144121-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-pl68l\" (UID: \"d3e603c6-bac1-496b-bf35-1e8124144121\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.208668 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7ffk" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="registry-server" probeResult="failure" output=< Dec 16 07:05:18 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 07:05:18 crc kubenswrapper[4789]: > Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.232456 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d7d67469c-bcmqz"] Dec 16 07:05:18 crc kubenswrapper[4789]: W1216 07:05:18.237605 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod518f8482_fa37_47b9_9a5f_4c80894ce2cd.slice/crio-fde3cd203b989efc0b2767cd458bca4e8e421e5f1b9ff5de0c107babd598e4df WatchSource:0}: Error finding 
container fde3cd203b989efc0b2767cd458bca4e8e421e5f1b9ff5de0c107babd598e4df: Status 404 returned error can't find the container with id fde3cd203b989efc0b2767cd458bca4e8e421e5f1b9ff5de0c107babd598e4df Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.278069 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/312b3314-cd6b-422b-910e-9fdf5df3d594-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.282780 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/312b3314-cd6b-422b-910e-9fdf5df3d594-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-2gnhj\" (UID: \"312b3314-cd6b-422b-910e-9fdf5df3d594\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.361091 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.400392 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d7d67469c-bcmqz" event={"ID":"518f8482-fa37-47b9-9a5f-4c80894ce2cd","Type":"ContainerStarted","Data":"81006f4994bde0ba191016d6e813adb19e5efe326eaf1fc0431951f9ccdd3897"} Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.400439 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d7d67469c-bcmqz" event={"ID":"518f8482-fa37-47b9-9a5f-4c80894ce2cd","Type":"ContainerStarted","Data":"fde3cd203b989efc0b2767cd458bca4e8e421e5f1b9ff5de0c107babd598e4df"} Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.402462 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bfwks" event={"ID":"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36","Type":"ContainerStarted","Data":"2d6b791d649a9be674586ae0577802a96b29c49958e96bf2be97d0cc2b4f0a21"} Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.403664 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7" event={"ID":"ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28","Type":"ContainerStarted","Data":"9390a53c522d701bbd2fa778dc7ef5d0808a0102a8f13ef0e409297f08e52b82"} Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.421498 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d7d67469c-bcmqz" podStartSLOduration=1.421470872 podStartE2EDuration="1.421470872s" podCreationTimestamp="2025-12-16 07:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:05:18.416680372 +0000 UTC m=+856.678568011" watchObservedRunningTime="2025-12-16 07:05:18.421470872 +0000 UTC m=+856.683358531" Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.534511 
4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-pl68l"] Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.536745 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" Dec 16 07:05:18 crc kubenswrapper[4789]: W1216 07:05:18.541488 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e603c6_bac1_496b_bf35_1e8124144121.slice/crio-31e9f2e549997715b462155f57b83c5f45751eebf77f70b1bdc06dbd64ccf553 WatchSource:0}: Error finding container 31e9f2e549997715b462155f57b83c5f45751eebf77f70b1bdc06dbd64ccf553: Status 404 returned error can't find the container with id 31e9f2e549997715b462155f57b83c5f45751eebf77f70b1bdc06dbd64ccf553 Dec 16 07:05:18 crc kubenswrapper[4789]: I1216 07:05:18.988966 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj"] Dec 16 07:05:18 crc kubenswrapper[4789]: W1216 07:05:18.997274 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312b3314_cd6b_422b_910e_9fdf5df3d594.slice/crio-5eafc6eb713994235ed8315a7970e52a9586451f7672e8414b13c8e7fe445ebc WatchSource:0}: Error finding container 5eafc6eb713994235ed8315a7970e52a9586451f7672e8414b13c8e7fe445ebc: Status 404 returned error can't find the container with id 5eafc6eb713994235ed8315a7970e52a9586451f7672e8414b13c8e7fe445ebc Dec 16 07:05:19 crc kubenswrapper[4789]: I1216 07:05:19.410590 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" event={"ID":"d3e603c6-bac1-496b-bf35-1e8124144121","Type":"ContainerStarted","Data":"31e9f2e549997715b462155f57b83c5f45751eebf77f70b1bdc06dbd64ccf553"} Dec 16 07:05:19 crc kubenswrapper[4789]: I1216 07:05:19.411965 4789 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" event={"ID":"312b3314-cd6b-422b-910e-9fdf5df3d594","Type":"ContainerStarted","Data":"5eafc6eb713994235ed8315a7970e52a9586451f7672e8414b13c8e7fe445ebc"} Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.422744 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bfwks" event={"ID":"2febe2b7-c4da-4fca-bb73-6e4e3bc19c36","Type":"ContainerStarted","Data":"974c2983dbaef48a6eeceb3a8022710de15ed40509e505abadc2addc9d75dd5a"} Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.423278 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bfwks" Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.424016 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7" event={"ID":"ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28","Type":"ContainerStarted","Data":"1f14b7cfaa7e986a17e844715dcd3f0c7555f434bc6a8d1d68f6a678bc166ea8"} Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.425029 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" event={"ID":"d3e603c6-bac1-496b-bf35-1e8124144121","Type":"ContainerStarted","Data":"baa14410620d23f0ab9d1142fae8f47e16035b54d186ba35211df5629c847fce"} Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.425226 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.439317 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bfwks" podStartSLOduration=1.833519653 podStartE2EDuration="4.439302298s" podCreationTimestamp="2025-12-16 07:05:17 +0000 UTC" firstStartedPulling="2025-12-16 07:05:17.821114805 +0000 UTC m=+856.083002434" lastFinishedPulling="2025-12-16 
07:05:20.42689744 +0000 UTC m=+858.688785079" observedRunningTime="2025-12-16 07:05:21.437144074 +0000 UTC m=+859.699031723" watchObservedRunningTime="2025-12-16 07:05:21.439302298 +0000 UTC m=+859.701189927" Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.457969 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" podStartSLOduration=2.415333204 podStartE2EDuration="4.457949648s" podCreationTimestamp="2025-12-16 07:05:17 +0000 UTC" firstStartedPulling="2025-12-16 07:05:18.543877799 +0000 UTC m=+856.805765428" lastFinishedPulling="2025-12-16 07:05:20.586494243 +0000 UTC m=+858.848381872" observedRunningTime="2025-12-16 07:05:21.451041334 +0000 UTC m=+859.712928983" watchObservedRunningTime="2025-12-16 07:05:21.457949648 +0000 UTC m=+859.719837277" Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.927654 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.928205 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.928254 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.928669 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3851c1899da6a57a194edd039ca6372a9930890332c280cff4f36f157e8d3272"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:05:21 crc kubenswrapper[4789]: I1216 07:05:21.928730 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://3851c1899da6a57a194edd039ca6372a9930890332c280cff4f36f157e8d3272" gracePeriod=600 Dec 16 07:05:23 crc kubenswrapper[4789]: I1216 07:05:23.447655 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="3851c1899da6a57a194edd039ca6372a9930890332c280cff4f36f157e8d3272" exitCode=0 Dec 16 07:05:23 crc kubenswrapper[4789]: I1216 07:05:23.447817 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"3851c1899da6a57a194edd039ca6372a9930890332c280cff4f36f157e8d3272"} Dec 16 07:05:23 crc kubenswrapper[4789]: I1216 07:05:23.448248 4789 scope.go:117] "RemoveContainer" containerID="d23462b9fda7f227b342d36364779a64e2d961db504204feb74ea1aa2b298979" Dec 16 07:05:23 crc kubenswrapper[4789]: I1216 07:05:23.451812 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" event={"ID":"312b3314-cd6b-422b-910e-9fdf5df3d594","Type":"ContainerStarted","Data":"f1c78ec391a8e09965ed71805cb06bff3f1510d6586494c60978861559d3affd"} Dec 16 07:05:24 crc kubenswrapper[4789]: I1216 07:05:24.459073 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" 
event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"b5498247db061c67566479b4544d243bb1272801b3a301b0847cb7fdd1e323de"} Dec 16 07:05:24 crc kubenswrapper[4789]: I1216 07:05:24.461463 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7" event={"ID":"ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28","Type":"ContainerStarted","Data":"2a5caaba04bb8644ae843322d4637cfba2c7fa791b6914e49ff9393dbc145adf"} Dec 16 07:05:24 crc kubenswrapper[4789]: I1216 07:05:24.476731 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-2gnhj" podStartSLOduration=3.541222515 podStartE2EDuration="7.476710268s" podCreationTimestamp="2025-12-16 07:05:17 +0000 UTC" firstStartedPulling="2025-12-16 07:05:18.999062825 +0000 UTC m=+857.260950464" lastFinishedPulling="2025-12-16 07:05:22.934550588 +0000 UTC m=+861.196438217" observedRunningTime="2025-12-16 07:05:23.465171541 +0000 UTC m=+861.727059170" watchObservedRunningTime="2025-12-16 07:05:24.476710268 +0000 UTC m=+862.738597917" Dec 16 07:05:24 crc kubenswrapper[4789]: I1216 07:05:24.494379 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-t4fl7" podStartSLOduration=1.975114723 podStartE2EDuration="7.494361511s" podCreationTimestamp="2025-12-16 07:05:17 +0000 UTC" firstStartedPulling="2025-12-16 07:05:17.987355195 +0000 UTC m=+856.249242824" lastFinishedPulling="2025-12-16 07:05:23.506601983 +0000 UTC m=+861.768489612" observedRunningTime="2025-12-16 07:05:24.492675479 +0000 UTC m=+862.754563128" watchObservedRunningTime="2025-12-16 07:05:24.494361511 +0000 UTC m=+862.756249140" Dec 16 07:05:27 crc kubenswrapper[4789]: I1216 07:05:27.212287 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7ffk" Dec 16 07:05:27 crc kubenswrapper[4789]: I1216 
07:05:27.250795 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g7ffk" Dec 16 07:05:27 crc kubenswrapper[4789]: I1216 07:05:27.450892 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7ffk"] Dec 16 07:05:27 crc kubenswrapper[4789]: I1216 07:05:27.806039 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bfwks" Dec 16 07:05:28 crc kubenswrapper[4789]: I1216 07:05:28.058153 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:28 crc kubenswrapper[4789]: I1216 07:05:28.058221 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:28 crc kubenswrapper[4789]: I1216 07:05:28.062031 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:28 crc kubenswrapper[4789]: I1216 07:05:28.483270 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g7ffk" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="registry-server" containerID="cri-o://e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df" gracePeriod=2 Dec 16 07:05:28 crc kubenswrapper[4789]: I1216 07:05:28.489285 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d7d67469c-bcmqz" Dec 16 07:05:28 crc kubenswrapper[4789]: I1216 07:05:28.541600 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cbqb2"] Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.315217 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ffk" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.418197 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbfmk\" (UniqueName: \"kubernetes.io/projected/33253be4-48c8-45fa-94d2-1f58434c8a7f-kube-api-access-kbfmk\") pod \"33253be4-48c8-45fa-94d2-1f58434c8a7f\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.418603 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-catalog-content\") pod \"33253be4-48c8-45fa-94d2-1f58434c8a7f\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.418629 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-utilities\") pod \"33253be4-48c8-45fa-94d2-1f58434c8a7f\" (UID: \"33253be4-48c8-45fa-94d2-1f58434c8a7f\") " Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.419557 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-utilities" (OuterVolumeSpecName: "utilities") pod "33253be4-48c8-45fa-94d2-1f58434c8a7f" (UID: "33253be4-48c8-45fa-94d2-1f58434c8a7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.423719 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33253be4-48c8-45fa-94d2-1f58434c8a7f-kube-api-access-kbfmk" (OuterVolumeSpecName: "kube-api-access-kbfmk") pod "33253be4-48c8-45fa-94d2-1f58434c8a7f" (UID: "33253be4-48c8-45fa-94d2-1f58434c8a7f"). InnerVolumeSpecName "kube-api-access-kbfmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.490086 4789 generic.go:334] "Generic (PLEG): container finished" podID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerID="e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df" exitCode=0 Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.490162 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7ffk" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.490149 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ffk" event={"ID":"33253be4-48c8-45fa-94d2-1f58434c8a7f","Type":"ContainerDied","Data":"e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df"} Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.490242 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7ffk" event={"ID":"33253be4-48c8-45fa-94d2-1f58434c8a7f","Type":"ContainerDied","Data":"07cdd563a880bc2edb24b6c662fffea29e9cc2626c25bd59dff91a2eac6d152c"} Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.490266 4789 scope.go:117] "RemoveContainer" containerID="e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.510001 4789 scope.go:117] "RemoveContainer" containerID="e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.520379 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbfmk\" (UniqueName: \"kubernetes.io/projected/33253be4-48c8-45fa-94d2-1f58434c8a7f-kube-api-access-kbfmk\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.520411 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.528488 4789 scope.go:117] "RemoveContainer" containerID="6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.531111 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33253be4-48c8-45fa-94d2-1f58434c8a7f" (UID: "33253be4-48c8-45fa-94d2-1f58434c8a7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.544505 4789 scope.go:117] "RemoveContainer" containerID="e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df" Dec 16 07:05:29 crc kubenswrapper[4789]: E1216 07:05:29.545324 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df\": container with ID starting with e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df not found: ID does not exist" containerID="e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.545365 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df"} err="failed to get container status \"e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df\": rpc error: code = NotFound desc = could not find container \"e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df\": container with ID starting with e40a78f0c5342bec81e883868062ed745be6a1257131f28c1c056464b819a4df not found: ID does not exist" Dec 16 07:05:29 crc 
kubenswrapper[4789]: I1216 07:05:29.545396 4789 scope.go:117] "RemoveContainer" containerID="e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832" Dec 16 07:05:29 crc kubenswrapper[4789]: E1216 07:05:29.545692 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832\": container with ID starting with e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832 not found: ID does not exist" containerID="e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.545725 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832"} err="failed to get container status \"e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832\": rpc error: code = NotFound desc = could not find container \"e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832\": container with ID starting with e06486e5c2edb5bdd1bb994dbacd9006b5dab309f79bc4b5a7a96097b37e4832 not found: ID does not exist" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.545745 4789 scope.go:117] "RemoveContainer" containerID="6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1" Dec 16 07:05:29 crc kubenswrapper[4789]: E1216 07:05:29.546034 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1\": container with ID starting with 6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1 not found: ID does not exist" containerID="6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.546059 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1"} err="failed to get container status \"6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1\": rpc error: code = NotFound desc = could not find container \"6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1\": container with ID starting with 6204bf2601a6c82eb4d8585c9e6dcad8310cf6599968088effcc8ee8ff3cfbe1 not found: ID does not exist" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.621891 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33253be4-48c8-45fa-94d2-1f58434c8a7f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.821975 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7ffk"] Dec 16 07:05:29 crc kubenswrapper[4789]: I1216 07:05:29.825086 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7ffk"] Dec 16 07:05:30 crc kubenswrapper[4789]: I1216 07:05:30.113840 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" path="/var/lib/kubelet/pods/33253be4-48c8-45fa-94d2-1f58434c8a7f/volumes" Dec 16 07:05:38 crc kubenswrapper[4789]: I1216 07:05:38.366639 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-pl68l" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.307051 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk"] Dec 16 07:05:49 crc kubenswrapper[4789]: E1216 07:05:49.310084 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="extract-content" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.310197 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="extract-content" Dec 16 07:05:49 crc kubenswrapper[4789]: E1216 07:05:49.310283 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="registry-server" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.310357 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="registry-server" Dec 16 07:05:49 crc kubenswrapper[4789]: E1216 07:05:49.310457 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="extract-utilities" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.310549 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="extract-utilities" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.310793 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="33253be4-48c8-45fa-94d2-1f58434c8a7f" containerName="registry-server" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.312042 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.314374 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.316343 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk"] Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.388943 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.389077 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.389184 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4btpl\" (UniqueName: \"kubernetes.io/projected/5167ca38-0011-4e95-81d3-48e193836144-kube-api-access-4btpl\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: 
I1216 07:05:49.490608 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4btpl\" (UniqueName: \"kubernetes.io/projected/5167ca38-0011-4e95-81d3-48e193836144-kube-api-access-4btpl\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.490669 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.490713 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.491280 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.491413 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.509686 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4btpl\" (UniqueName: \"kubernetes.io/projected/5167ca38-0011-4e95-81d3-48e193836144-kube-api-access-4btpl\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.645861 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:49 crc kubenswrapper[4789]: I1216 07:05:49.817550 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk"] Dec 16 07:05:50 crc kubenswrapper[4789]: I1216 07:05:50.610638 4789 generic.go:334] "Generic (PLEG): container finished" podID="5167ca38-0011-4e95-81d3-48e193836144" containerID="fd0ed3257d966f074ce83808060806585446b0db2e29b360c23441a9a4692a78" exitCode=0 Dec 16 07:05:50 crc kubenswrapper[4789]: I1216 07:05:50.610695 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" event={"ID":"5167ca38-0011-4e95-81d3-48e193836144","Type":"ContainerDied","Data":"fd0ed3257d966f074ce83808060806585446b0db2e29b360c23441a9a4692a78"} Dec 16 07:05:50 crc kubenswrapper[4789]: I1216 07:05:50.610928 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" event={"ID":"5167ca38-0011-4e95-81d3-48e193836144","Type":"ContainerStarted","Data":"b4948a5654e679fefd54e29f33f17efa6f0dfe2b14823ebcac53f5226cb4baee"} Dec 16 07:05:52 crc kubenswrapper[4789]: I1216 07:05:52.623220 4789 generic.go:334] "Generic (PLEG): container finished" podID="5167ca38-0011-4e95-81d3-48e193836144" containerID="f14b6bc25561ec19380e0eff46fddf6da2acab2bef4e318fa0d244c951b7adbb" exitCode=0 Dec 16 07:05:52 crc kubenswrapper[4789]: I1216 07:05:52.623323 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" event={"ID":"5167ca38-0011-4e95-81d3-48e193836144","Type":"ContainerDied","Data":"f14b6bc25561ec19380e0eff46fddf6da2acab2bef4e318fa0d244c951b7adbb"} Dec 16 07:05:53 crc kubenswrapper[4789]: I1216 07:05:53.583868 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-cbqb2" podUID="28e992ee-e81f-46d7-b422-27fa3023b7d8" containerName="console" containerID="cri-o://a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501" gracePeriod=15 Dec 16 07:05:53 crc kubenswrapper[4789]: I1216 07:05:53.649171 4789 generic.go:334] "Generic (PLEG): container finished" podID="5167ca38-0011-4e95-81d3-48e193836144" containerID="dd59d5358db9ce915f8026776e61bd9e583c005f93790f982d671270f25fdf3b" exitCode=0 Dec 16 07:05:53 crc kubenswrapper[4789]: I1216 07:05:53.649227 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" event={"ID":"5167ca38-0011-4e95-81d3-48e193836144","Type":"ContainerDied","Data":"dd59d5358db9ce915f8026776e61bd9e583c005f93790f982d671270f25fdf3b"} Dec 16 07:05:53 crc kubenswrapper[4789]: I1216 07:05:53.919439 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-cbqb2_28e992ee-e81f-46d7-b422-27fa3023b7d8/console/0.log" Dec 16 07:05:53 crc kubenswrapper[4789]: I1216 07:05:53.919828 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049008 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-config\") pod \"28e992ee-e81f-46d7-b422-27fa3023b7d8\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049041 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-oauth-serving-cert\") pod \"28e992ee-e81f-46d7-b422-27fa3023b7d8\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049082 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-trusted-ca-bundle\") pod \"28e992ee-e81f-46d7-b422-27fa3023b7d8\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049125 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-oauth-config\") pod \"28e992ee-e81f-46d7-b422-27fa3023b7d8\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049161 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-service-ca\") 
pod \"28e992ee-e81f-46d7-b422-27fa3023b7d8\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049217 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtnng\" (UniqueName: \"kubernetes.io/projected/28e992ee-e81f-46d7-b422-27fa3023b7d8-kube-api-access-dtnng\") pod \"28e992ee-e81f-46d7-b422-27fa3023b7d8\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049241 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-serving-cert\") pod \"28e992ee-e81f-46d7-b422-27fa3023b7d8\" (UID: \"28e992ee-e81f-46d7-b422-27fa3023b7d8\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049936 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "28e992ee-e81f-46d7-b422-27fa3023b7d8" (UID: "28e992ee-e81f-46d7-b422-27fa3023b7d8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.049947 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-config" (OuterVolumeSpecName: "console-config") pod "28e992ee-e81f-46d7-b422-27fa3023b7d8" (UID: "28e992ee-e81f-46d7-b422-27fa3023b7d8"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.050039 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "28e992ee-e81f-46d7-b422-27fa3023b7d8" (UID: "28e992ee-e81f-46d7-b422-27fa3023b7d8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.050329 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-service-ca" (OuterVolumeSpecName: "service-ca") pod "28e992ee-e81f-46d7-b422-27fa3023b7d8" (UID: "28e992ee-e81f-46d7-b422-27fa3023b7d8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.055468 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e992ee-e81f-46d7-b422-27fa3023b7d8-kube-api-access-dtnng" (OuterVolumeSpecName: "kube-api-access-dtnng") pod "28e992ee-e81f-46d7-b422-27fa3023b7d8" (UID: "28e992ee-e81f-46d7-b422-27fa3023b7d8"). InnerVolumeSpecName "kube-api-access-dtnng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.056164 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "28e992ee-e81f-46d7-b422-27fa3023b7d8" (UID: "28e992ee-e81f-46d7-b422-27fa3023b7d8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.063391 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "28e992ee-e81f-46d7-b422-27fa3023b7d8" (UID: "28e992ee-e81f-46d7-b422-27fa3023b7d8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.150721 4789 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.150780 4789 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.150834 4789 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.151001 4789 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.151049 4789 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28e992ee-e81f-46d7-b422-27fa3023b7d8-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.151071 4789 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dtnng\" (UniqueName: \"kubernetes.io/projected/28e992ee-e81f-46d7-b422-27fa3023b7d8-kube-api-access-dtnng\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.151090 4789 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28e992ee-e81f-46d7-b422-27fa3023b7d8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.656355 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-cbqb2_28e992ee-e81f-46d7-b422-27fa3023b7d8/console/0.log" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.656431 4789 generic.go:334] "Generic (PLEG): container finished" podID="28e992ee-e81f-46d7-b422-27fa3023b7d8" containerID="a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501" exitCode=2 Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.656753 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-cbqb2" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.657871 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cbqb2" event={"ID":"28e992ee-e81f-46d7-b422-27fa3023b7d8","Type":"ContainerDied","Data":"a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501"} Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.657935 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-cbqb2" event={"ID":"28e992ee-e81f-46d7-b422-27fa3023b7d8","Type":"ContainerDied","Data":"1f1bec6865de399183647305ba1c580d76511b4f2703e7b3ecd7c92425ef06ff"} Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.657963 4789 scope.go:117] "RemoveContainer" containerID="a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.680531 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-cbqb2"] Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.681058 4789 scope.go:117] "RemoveContainer" containerID="a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501" Dec 16 07:05:54 crc kubenswrapper[4789]: E1216 07:05:54.683154 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501\": container with ID starting with a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501 not found: ID does not exist" containerID="a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.683188 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501"} err="failed to get container status 
\"a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501\": rpc error: code = NotFound desc = could not find container \"a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501\": container with ID starting with a5ef7ee0a6743101d2fdb1064ef4ca6204ac5f4526d45cd93087e8e937deb501 not found: ID does not exist" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.685337 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-cbqb2"] Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.907868 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.965847 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4btpl\" (UniqueName: \"kubernetes.io/projected/5167ca38-0011-4e95-81d3-48e193836144-kube-api-access-4btpl\") pod \"5167ca38-0011-4e95-81d3-48e193836144\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.966020 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-bundle\") pod \"5167ca38-0011-4e95-81d3-48e193836144\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.966064 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-util\") pod \"5167ca38-0011-4e95-81d3-48e193836144\" (UID: \"5167ca38-0011-4e95-81d3-48e193836144\") " Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.966872 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-bundle" 
(OuterVolumeSpecName: "bundle") pod "5167ca38-0011-4e95-81d3-48e193836144" (UID: "5167ca38-0011-4e95-81d3-48e193836144"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.969904 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5167ca38-0011-4e95-81d3-48e193836144-kube-api-access-4btpl" (OuterVolumeSpecName: "kube-api-access-4btpl") pod "5167ca38-0011-4e95-81d3-48e193836144" (UID: "5167ca38-0011-4e95-81d3-48e193836144"). InnerVolumeSpecName "kube-api-access-4btpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:05:54 crc kubenswrapper[4789]: I1216 07:05:54.980299 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-util" (OuterVolumeSpecName: "util") pod "5167ca38-0011-4e95-81d3-48e193836144" (UID: "5167ca38-0011-4e95-81d3-48e193836144"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:05:55 crc kubenswrapper[4789]: I1216 07:05:55.069175 4789 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:55 crc kubenswrapper[4789]: I1216 07:05:55.069273 4789 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5167ca38-0011-4e95-81d3-48e193836144-util\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:55 crc kubenswrapper[4789]: I1216 07:05:55.069295 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4btpl\" (UniqueName: \"kubernetes.io/projected/5167ca38-0011-4e95-81d3-48e193836144-kube-api-access-4btpl\") on node \"crc\" DevicePath \"\"" Dec 16 07:05:55 crc kubenswrapper[4789]: I1216 07:05:55.665240 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" Dec 16 07:05:55 crc kubenswrapper[4789]: I1216 07:05:55.665221 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk" event={"ID":"5167ca38-0011-4e95-81d3-48e193836144","Type":"ContainerDied","Data":"b4948a5654e679fefd54e29f33f17efa6f0dfe2b14823ebcac53f5226cb4baee"} Dec 16 07:05:55 crc kubenswrapper[4789]: I1216 07:05:55.665416 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4948a5654e679fefd54e29f33f17efa6f0dfe2b14823ebcac53f5226cb4baee" Dec 16 07:05:56 crc kubenswrapper[4789]: I1216 07:05:56.113663 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e992ee-e81f-46d7-b422-27fa3023b7d8" path="/var/lib/kubelet/pods/28e992ee-e81f-46d7-b422-27fa3023b7d8/volumes" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.194887 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4"] Dec 16 07:06:04 crc kubenswrapper[4789]: E1216 07:06:04.195636 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e992ee-e81f-46d7-b422-27fa3023b7d8" containerName="console" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.195652 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e992ee-e81f-46d7-b422-27fa3023b7d8" containerName="console" Dec 16 07:06:04 crc kubenswrapper[4789]: E1216 07:06:04.195669 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5167ca38-0011-4e95-81d3-48e193836144" containerName="pull" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.195675 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5167ca38-0011-4e95-81d3-48e193836144" containerName="pull" Dec 16 07:06:04 crc kubenswrapper[4789]: E1216 07:06:04.195688 4789 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5167ca38-0011-4e95-81d3-48e193836144" containerName="extract" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.195694 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5167ca38-0011-4e95-81d3-48e193836144" containerName="extract" Dec 16 07:06:04 crc kubenswrapper[4789]: E1216 07:06:04.195703 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5167ca38-0011-4e95-81d3-48e193836144" containerName="util" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.195710 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5167ca38-0011-4e95-81d3-48e193836144" containerName="util" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.195807 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5167ca38-0011-4e95-81d3-48e193836144" containerName="extract" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.195820 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e992ee-e81f-46d7-b422-27fa3023b7d8" containerName="console" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.196214 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.198029 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.198608 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.199007 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.199174 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9xgk2" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.200943 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.223561 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4"] Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.261870 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14d02d92-8bff-4937-9b63-13592d6626fc-webhook-cert\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.261960 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqcr9\" (UniqueName: \"kubernetes.io/projected/14d02d92-8bff-4937-9b63-13592d6626fc-kube-api-access-hqcr9\") pod 
\"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.261984 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14d02d92-8bff-4937-9b63-13592d6626fc-apiservice-cert\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.362571 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqcr9\" (UniqueName: \"kubernetes.io/projected/14d02d92-8bff-4937-9b63-13592d6626fc-kube-api-access-hqcr9\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.362622 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14d02d92-8bff-4937-9b63-13592d6626fc-apiservice-cert\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.362699 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14d02d92-8bff-4937-9b63-13592d6626fc-webhook-cert\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc 
kubenswrapper[4789]: I1216 07:06:04.380784 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14d02d92-8bff-4937-9b63-13592d6626fc-webhook-cert\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.381165 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14d02d92-8bff-4937-9b63-13592d6626fc-apiservice-cert\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.386738 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqcr9\" (UniqueName: \"kubernetes.io/projected/14d02d92-8bff-4937-9b63-13592d6626fc-kube-api-access-hqcr9\") pod \"metallb-operator-controller-manager-6bffc6b469-npjk4\" (UID: \"14d02d92-8bff-4937-9b63-13592d6626fc\") " pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.463573 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c"] Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.464279 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.468536 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.469308 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-g6kpw" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.475028 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.480143 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c"] Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.510231 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.665755 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bb94b2b-4f6d-4600-aa95-93751de6c723-webhook-cert\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.665804 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bb94b2b-4f6d-4600-aa95-93751de6c723-apiservice-cert\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.665990 
4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pmj8\" (UniqueName: \"kubernetes.io/projected/0bb94b2b-4f6d-4600-aa95-93751de6c723-kube-api-access-6pmj8\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.768963 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bb94b2b-4f6d-4600-aa95-93751de6c723-webhook-cert\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.769287 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bb94b2b-4f6d-4600-aa95-93751de6c723-apiservice-cert\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.769310 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pmj8\" (UniqueName: \"kubernetes.io/projected/0bb94b2b-4f6d-4600-aa95-93751de6c723-kube-api-access-6pmj8\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.774720 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bb94b2b-4f6d-4600-aa95-93751de6c723-apiservice-cert\") pod 
\"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.776934 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bb94b2b-4f6d-4600-aa95-93751de6c723-webhook-cert\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.801535 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pmj8\" (UniqueName: \"kubernetes.io/projected/0bb94b2b-4f6d-4600-aa95-93751de6c723-kube-api-access-6pmj8\") pod \"metallb-operator-webhook-server-5d79c48cdb-tn64c\" (UID: \"0bb94b2b-4f6d-4600-aa95-93751de6c723\") " pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:04 crc kubenswrapper[4789]: I1216 07:06:04.814505 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4"] Dec 16 07:06:04 crc kubenswrapper[4789]: W1216 07:06:04.821495 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d02d92_8bff_4937_9b63_13592d6626fc.slice/crio-e4a23d591a75875c8399168e2a3bcff4c849ac839282834dab99696fe88c6d05 WatchSource:0}: Error finding container e4a23d591a75875c8399168e2a3bcff4c849ac839282834dab99696fe88c6d05: Status 404 returned error can't find the container with id e4a23d591a75875c8399168e2a3bcff4c849ac839282834dab99696fe88c6d05 Dec 16 07:06:05 crc kubenswrapper[4789]: I1216 07:06:05.077413 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:05 crc kubenswrapper[4789]: I1216 07:06:05.282770 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c"] Dec 16 07:06:05 crc kubenswrapper[4789]: W1216 07:06:05.282961 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bb94b2b_4f6d_4600_aa95_93751de6c723.slice/crio-f6cef8bbd87c6ba8fde4485f940972add63267e5917fadfd6279df3e5671a45f WatchSource:0}: Error finding container f6cef8bbd87c6ba8fde4485f940972add63267e5917fadfd6279df3e5671a45f: Status 404 returned error can't find the container with id f6cef8bbd87c6ba8fde4485f940972add63267e5917fadfd6279df3e5671a45f Dec 16 07:06:05 crc kubenswrapper[4789]: I1216 07:06:05.756398 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" event={"ID":"14d02d92-8bff-4937-9b63-13592d6626fc","Type":"ContainerStarted","Data":"e4a23d591a75875c8399168e2a3bcff4c849ac839282834dab99696fe88c6d05"} Dec 16 07:06:05 crc kubenswrapper[4789]: I1216 07:06:05.757885 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" event={"ID":"0bb94b2b-4f6d-4600-aa95-93751de6c723","Type":"ContainerStarted","Data":"f6cef8bbd87c6ba8fde4485f940972add63267e5917fadfd6279df3e5671a45f"} Dec 16 07:06:09 crc kubenswrapper[4789]: I1216 07:06:09.800703 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" event={"ID":"14d02d92-8bff-4937-9b63-13592d6626fc","Type":"ContainerStarted","Data":"24ef3a47cccc63c1784438af2d136fb3cdb2a01cbadf1efc4393332eb79768ed"} Dec 16 07:06:09 crc kubenswrapper[4789]: I1216 07:06:09.801255 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:09 crc kubenswrapper[4789]: I1216 07:06:09.803477 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" event={"ID":"0bb94b2b-4f6d-4600-aa95-93751de6c723","Type":"ContainerStarted","Data":"406ff2d613b8f211a51512dd02c0dff480abcc8df3825199e39c780a81449b77"} Dec 16 07:06:09 crc kubenswrapper[4789]: I1216 07:06:09.803862 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:09 crc kubenswrapper[4789]: I1216 07:06:09.826029 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" podStartSLOduration=1.27301291 podStartE2EDuration="5.825998271s" podCreationTimestamp="2025-12-16 07:06:04 +0000 UTC" firstStartedPulling="2025-12-16 07:06:04.824810137 +0000 UTC m=+903.086697756" lastFinishedPulling="2025-12-16 07:06:09.377795488 +0000 UTC m=+907.639683117" observedRunningTime="2025-12-16 07:06:09.821718806 +0000 UTC m=+908.083606445" watchObservedRunningTime="2025-12-16 07:06:09.825998271 +0000 UTC m=+908.087885950" Dec 16 07:06:09 crc kubenswrapper[4789]: I1216 07:06:09.845804 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" podStartSLOduration=1.722026252 podStartE2EDuration="5.845787998s" podCreationTimestamp="2025-12-16 07:06:04 +0000 UTC" firstStartedPulling="2025-12-16 07:06:05.284348213 +0000 UTC m=+903.546235842" lastFinishedPulling="2025-12-16 07:06:09.408109959 +0000 UTC m=+907.669997588" observedRunningTime="2025-12-16 07:06:09.843638295 +0000 UTC m=+908.105525934" watchObservedRunningTime="2025-12-16 07:06:09.845787998 +0000 UTC m=+908.107675627" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.577853 4789 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d9fvr"] Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.579405 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.592865 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d9fvr"] Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.690252 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-catalog-content\") pod \"community-operators-d9fvr\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.690296 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj2br\" (UniqueName: \"kubernetes.io/projected/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-kube-api-access-mj2br\") pod \"community-operators-d9fvr\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.690338 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-utilities\") pod \"community-operators-d9fvr\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.791484 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-utilities\") pod \"community-operators-d9fvr\" (UID: 
\"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.791585 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-catalog-content\") pod \"community-operators-d9fvr\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.791608 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj2br\" (UniqueName: \"kubernetes.io/projected/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-kube-api-access-mj2br\") pod \"community-operators-d9fvr\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.792092 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-utilities\") pod \"community-operators-d9fvr\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.792116 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-catalog-content\") pod \"community-operators-d9fvr\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.817348 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj2br\" (UniqueName: \"kubernetes.io/projected/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-kube-api-access-mj2br\") pod \"community-operators-d9fvr\" (UID: 
\"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:22 crc kubenswrapper[4789]: I1216 07:06:22.898671 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:23 crc kubenswrapper[4789]: I1216 07:06:23.415364 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d9fvr"] Dec 16 07:06:23 crc kubenswrapper[4789]: I1216 07:06:23.871413 4789 generic.go:334] "Generic (PLEG): container finished" podID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerID="19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f" exitCode=0 Dec 16 07:06:23 crc kubenswrapper[4789]: I1216 07:06:23.871624 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9fvr" event={"ID":"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3","Type":"ContainerDied","Data":"19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f"} Dec 16 07:06:23 crc kubenswrapper[4789]: I1216 07:06:23.871771 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9fvr" event={"ID":"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3","Type":"ContainerStarted","Data":"c6ea57c6789934b6d8fbca282245778203ddde58834e3f385b682832e420dcf4"} Dec 16 07:06:24 crc kubenswrapper[4789]: I1216 07:06:24.881448 4789 generic.go:334] "Generic (PLEG): container finished" podID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerID="ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc" exitCode=0 Dec 16 07:06:24 crc kubenswrapper[4789]: I1216 07:06:24.881637 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9fvr" event={"ID":"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3","Type":"ContainerDied","Data":"ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc"} Dec 16 07:06:25 crc kubenswrapper[4789]: I1216 
07:06:25.081533 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5d79c48cdb-tn64c" Dec 16 07:06:25 crc kubenswrapper[4789]: I1216 07:06:25.889240 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9fvr" event={"ID":"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3","Type":"ContainerStarted","Data":"9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632"} Dec 16 07:06:25 crc kubenswrapper[4789]: I1216 07:06:25.906439 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d9fvr" podStartSLOduration=2.385037551 podStartE2EDuration="3.906423056s" podCreationTimestamp="2025-12-16 07:06:22 +0000 UTC" firstStartedPulling="2025-12-16 07:06:23.874064515 +0000 UTC m=+922.135952134" lastFinishedPulling="2025-12-16 07:06:25.39545001 +0000 UTC m=+923.657337639" observedRunningTime="2025-12-16 07:06:25.90538999 +0000 UTC m=+924.167277629" watchObservedRunningTime="2025-12-16 07:06:25.906423056 +0000 UTC m=+924.168310685" Dec 16 07:06:32 crc kubenswrapper[4789]: I1216 07:06:32.898988 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:32 crc kubenswrapper[4789]: I1216 07:06:32.899521 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:32 crc kubenswrapper[4789]: I1216 07:06:32.960938 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:33 crc kubenswrapper[4789]: I1216 07:06:33.005723 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:34 crc kubenswrapper[4789]: I1216 07:06:34.982043 4789 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ztph7"] Dec 16 07:06:34 crc kubenswrapper[4789]: I1216 07:06:34.983681 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:34 crc kubenswrapper[4789]: I1216 07:06:34.999232 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztph7"] Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.060037 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-utilities\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.060210 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-catalog-content\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.060277 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg84b\" (UniqueName: \"kubernetes.io/projected/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-kube-api-access-xg84b\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.161700 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-catalog-content\") pod \"redhat-marketplace-ztph7\" (UID: 
\"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.161771 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg84b\" (UniqueName: \"kubernetes.io/projected/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-kube-api-access-xg84b\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.161809 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-utilities\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.162296 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-catalog-content\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.162354 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-utilities\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.190257 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg84b\" (UniqueName: \"kubernetes.io/projected/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-kube-api-access-xg84b\") pod \"redhat-marketplace-ztph7\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " 
pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.301549 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.501317 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztph7"] Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.966298 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztph7" event={"ID":"e697dbe4-6faf-419e-9dd1-79e1ec80ab34","Type":"ContainerStarted","Data":"feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b"} Dec 16 07:06:35 crc kubenswrapper[4789]: I1216 07:06:35.966351 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztph7" event={"ID":"e697dbe4-6faf-419e-9dd1-79e1ec80ab34","Type":"ContainerStarted","Data":"ef53a8a0fc6ef1f36e9b18fcfe90d43824f755c713c9063868cd94e1bb900f6b"} Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.568304 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d9fvr"] Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.568794 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d9fvr" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerName="registry-server" containerID="cri-o://9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632" gracePeriod=2 Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.937448 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.973366 4789 generic.go:334] "Generic (PLEG): container finished" podID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerID="9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632" exitCode=0 Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.973442 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9fvr" Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.973448 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9fvr" event={"ID":"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3","Type":"ContainerDied","Data":"9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632"} Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.973583 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9fvr" event={"ID":"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3","Type":"ContainerDied","Data":"c6ea57c6789934b6d8fbca282245778203ddde58834e3f385b682832e420dcf4"} Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.973614 4789 scope.go:117] "RemoveContainer" containerID="9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632" Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.974825 4789 generic.go:334] "Generic (PLEG): container finished" podID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerID="feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b" exitCode=0 Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.974849 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztph7" event={"ID":"e697dbe4-6faf-419e-9dd1-79e1ec80ab34","Type":"ContainerDied","Data":"feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b"} Dec 16 07:06:36 crc kubenswrapper[4789]: I1216 07:06:36.989560 4789 
scope.go:117] "RemoveContainer" containerID="ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.003364 4789 scope.go:117] "RemoveContainer" containerID="19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.022868 4789 scope.go:117] "RemoveContainer" containerID="9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632" Dec 16 07:06:37 crc kubenswrapper[4789]: E1216 07:06:37.023252 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632\": container with ID starting with 9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632 not found: ID does not exist" containerID="9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.023289 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632"} err="failed to get container status \"9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632\": rpc error: code = NotFound desc = could not find container \"9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632\": container with ID starting with 9cf50aa31f07e5f5f75ecefa7b063d85d982c89ed79eddc0b476af18b5283632 not found: ID does not exist" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.023316 4789 scope.go:117] "RemoveContainer" containerID="ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc" Dec 16 07:06:37 crc kubenswrapper[4789]: E1216 07:06:37.023598 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc\": container with ID starting with 
ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc not found: ID does not exist" containerID="ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.023627 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc"} err="failed to get container status \"ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc\": rpc error: code = NotFound desc = could not find container \"ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc\": container with ID starting with ddf27ce18fdc7cdc78791fb1c9d2f113f8e503a31031b99716836504ae59a9fc not found: ID does not exist" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.023652 4789 scope.go:117] "RemoveContainer" containerID="19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f" Dec 16 07:06:37 crc kubenswrapper[4789]: E1216 07:06:37.024008 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f\": container with ID starting with 19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f not found: ID does not exist" containerID="19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.024052 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f"} err="failed to get container status \"19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f\": rpc error: code = NotFound desc = could not find container \"19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f\": container with ID starting with 19f78fb4ebbe631393a34d10b8538f6312e9801df0f1ffcabceb07a60cbb839f not found: ID does not 
exist" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.087230 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-catalog-content\") pod \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.087360 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-utilities\") pod \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.087492 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj2br\" (UniqueName: \"kubernetes.io/projected/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-kube-api-access-mj2br\") pod \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\" (UID: \"ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3\") " Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.088042 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-utilities" (OuterVolumeSpecName: "utilities") pod "ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" (UID: "ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.093899 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-kube-api-access-mj2br" (OuterVolumeSpecName: "kube-api-access-mj2br") pod "ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" (UID: "ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3"). InnerVolumeSpecName "kube-api-access-mj2br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.133306 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" (UID: "ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.189078 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.189116 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj2br\" (UniqueName: \"kubernetes.io/projected/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-kube-api-access-mj2br\") on node \"crc\" DevicePath \"\"" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.189127 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.322340 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d9fvr"] Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.326141 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d9fvr"] Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.983807 4789 generic.go:334] "Generic (PLEG): container finished" podID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerID="45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a" exitCode=0 Dec 16 07:06:37 crc kubenswrapper[4789]: I1216 07:06:37.983898 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztph7" event={"ID":"e697dbe4-6faf-419e-9dd1-79e1ec80ab34","Type":"ContainerDied","Data":"45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a"} Dec 16 07:06:38 crc kubenswrapper[4789]: I1216 07:06:38.111646 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" path="/var/lib/kubelet/pods/ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3/volumes" Dec 16 07:06:38 crc kubenswrapper[4789]: I1216 07:06:38.993045 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztph7" event={"ID":"e697dbe4-6faf-419e-9dd1-79e1ec80ab34","Type":"ContainerStarted","Data":"f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d"} Dec 16 07:06:39 crc kubenswrapper[4789]: I1216 07:06:39.015088 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ztph7" podStartSLOduration=2.361332297 podStartE2EDuration="5.015066753s" podCreationTimestamp="2025-12-16 07:06:34 +0000 UTC" firstStartedPulling="2025-12-16 07:06:35.967954928 +0000 UTC m=+934.229842547" lastFinishedPulling="2025-12-16 07:06:38.621689364 +0000 UTC m=+936.883577003" observedRunningTime="2025-12-16 07:06:39.014362246 +0000 UTC m=+937.276249875" watchObservedRunningTime="2025-12-16 07:06:39.015066753 +0000 UTC m=+937.276954402" Dec 16 07:06:44 crc kubenswrapper[4789]: I1216 07:06:44.514305 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6bffc6b469-npjk4" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.257683 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rfbh5"] Dec 16 07:06:45 crc kubenswrapper[4789]: E1216 07:06:45.258235 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" 
containerName="extract-content" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.258250 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerName="extract-content" Dec 16 07:06:45 crc kubenswrapper[4789]: E1216 07:06:45.258269 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerName="registry-server" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.258277 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerName="registry-server" Dec 16 07:06:45 crc kubenswrapper[4789]: E1216 07:06:45.258284 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerName="extract-utilities" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.258291 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerName="extract-utilities" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.258389 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef14e3d8-6a06-4b01-a5a7-70cff8b87ce3" containerName="registry-server" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.260178 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.262483 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-mkdg5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.262647 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.262818 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.269479 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp"] Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.270129 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.271776 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.282149 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp"] Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.301950 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.302632 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.359865 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.372799 4789 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/speaker-zk7nk"] Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.374205 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.380235 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kj8z7" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.381028 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.381191 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.381324 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.390807 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-pjwhk"] Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.391881 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.395803 4789 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.404204 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-reloader\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.404428 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-startup\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.404522 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0535b2c-a6bb-4092-b481-ccba194fd9b4-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-5spwp\" (UID: \"f0535b2c-a6bb-4092-b481-ccba194fd9b4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.404607 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsqcz\" (UniqueName: \"kubernetes.io/projected/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-kube-api-access-xsqcz\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.404694 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7n68\" 
(UniqueName: \"kubernetes.io/projected/f0535b2c-a6bb-4092-b481-ccba194fd9b4-kube-api-access-v7n68\") pod \"frr-k8s-webhook-server-7784b6fcf-5spwp\" (UID: \"f0535b2c-a6bb-4092-b481-ccba194fd9b4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.404840 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-metrics\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.404942 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-metrics-certs\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.405035 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-sockets\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.405180 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-conf\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.416407 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-pjwhk"] Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.506990 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-metrics\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507075 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-metrics-certs\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507144 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-sockets\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507198 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507220 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-metrics-certs\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507241 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-metrics-certs\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507271 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-cert\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507298 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4ndw\" (UniqueName: \"kubernetes.io/projected/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-kube-api-access-f4ndw\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507328 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-conf\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507363 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-metallb-excludel2\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507392 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-reloader\") pod \"frr-k8s-rfbh5\" (UID: 
\"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507417 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-startup\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507429 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-metrics\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507442 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0535b2c-a6bb-4092-b481-ccba194fd9b4-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-5spwp\" (UID: \"f0535b2c-a6bb-4092-b481-ccba194fd9b4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507500 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q77gz\" (UniqueName: \"kubernetes.io/projected/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-kube-api-access-q77gz\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.507542 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsqcz\" (UniqueName: \"kubernetes.io/projected/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-kube-api-access-xsqcz\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc 
kubenswrapper[4789]: I1216 07:06:45.507571 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7n68\" (UniqueName: \"kubernetes.io/projected/f0535b2c-a6bb-4092-b481-ccba194fd9b4-kube-api-access-v7n68\") pod \"frr-k8s-webhook-server-7784b6fcf-5spwp\" (UID: \"f0535b2c-a6bb-4092-b481-ccba194fd9b4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.508191 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-conf\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.508390 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-sockets\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.508554 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-reloader\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.509538 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-frr-startup\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.521523 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-metrics-certs\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.522968 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0535b2c-a6bb-4092-b481-ccba194fd9b4-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-5spwp\" (UID: \"f0535b2c-a6bb-4092-b481-ccba194fd9b4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.524082 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsqcz\" (UniqueName: \"kubernetes.io/projected/d3ae31c7-d2d7-4a0f-b8eb-59d7be857994-kube-api-access-xsqcz\") pod \"frr-k8s-rfbh5\" (UID: \"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994\") " pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.530842 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7n68\" (UniqueName: \"kubernetes.io/projected/f0535b2c-a6bb-4092-b481-ccba194fd9b4-kube-api-access-v7n68\") pod \"frr-k8s-webhook-server-7784b6fcf-5spwp\" (UID: \"f0535b2c-a6bb-4092-b481-ccba194fd9b4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.580598 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.589111 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.609110 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.609156 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-metrics-certs\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.609177 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-metrics-certs\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.609204 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-cert\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.609232 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4ndw\" (UniqueName: \"kubernetes.io/projected/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-kube-api-access-f4ndw\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.609261 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-metallb-excludel2\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.609296 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q77gz\" (UniqueName: \"kubernetes.io/projected/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-kube-api-access-q77gz\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: E1216 07:06:45.609716 4789 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 07:06:45 crc kubenswrapper[4789]: E1216 07:06:45.609771 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist podName:fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6 nodeName:}" failed. No retries permitted until 2025-12-16 07:06:46.109751002 +0000 UTC m=+944.371638631 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist") pod "speaker-zk7nk" (UID: "fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6") : secret "metallb-memberlist" not found Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.611587 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-metallb-excludel2\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.614441 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-metrics-certs\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.616732 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-metrics-certs\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.622299 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-cert\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.632414 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4ndw\" (UniqueName: \"kubernetes.io/projected/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-kube-api-access-f4ndw\") pod \"speaker-zk7nk\" (UID: 
\"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.633426 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q77gz\" (UniqueName: \"kubernetes.io/projected/f1fca8a9-546e-4fa3-b08f-cf5df54303e0-kube-api-access-q77gz\") pod \"controller-5bddd4b946-pjwhk\" (UID: \"f1fca8a9-546e-4fa3-b08f-cf5df54303e0\") " pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.738104 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.777125 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp"] Dec 16 07:06:45 crc kubenswrapper[4789]: W1216 07:06:45.791540 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0535b2c_a6bb_4092_b481_ccba194fd9b4.slice/crio-24a1abd25e2044d8670a84b2ade45abdadadd116f9594b2da84dfe31d7cd525a WatchSource:0}: Error finding container 24a1abd25e2044d8670a84b2ade45abdadadd116f9594b2da84dfe31d7cd525a: Status 404 returned error can't find the container with id 24a1abd25e2044d8670a84b2ade45abdadadd116f9594b2da84dfe31d7cd525a Dec 16 07:06:45 crc kubenswrapper[4789]: I1216 07:06:45.904618 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-pjwhk"] Dec 16 07:06:45 crc kubenswrapper[4789]: W1216 07:06:45.907801 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1fca8a9_546e_4fa3_b08f_cf5df54303e0.slice/crio-15ab0d22b99e16c2fe9aed167a28e067b5d3397da4495f30a7014dede73012c7 WatchSource:0}: Error finding container 15ab0d22b99e16c2fe9aed167a28e067b5d3397da4495f30a7014dede73012c7: Status 404 returned error can't 
find the container with id 15ab0d22b99e16c2fe9aed167a28e067b5d3397da4495f30a7014dede73012c7 Dec 16 07:06:46 crc kubenswrapper[4789]: I1216 07:06:46.032135 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" event={"ID":"f0535b2c-a6bb-4092-b481-ccba194fd9b4","Type":"ContainerStarted","Data":"24a1abd25e2044d8670a84b2ade45abdadadd116f9594b2da84dfe31d7cd525a"} Dec 16 07:06:46 crc kubenswrapper[4789]: I1216 07:06:46.033185 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-pjwhk" event={"ID":"f1fca8a9-546e-4fa3-b08f-cf5df54303e0","Type":"ContainerStarted","Data":"15ab0d22b99e16c2fe9aed167a28e067b5d3397da4495f30a7014dede73012c7"} Dec 16 07:06:46 crc kubenswrapper[4789]: I1216 07:06:46.071688 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:46 crc kubenswrapper[4789]: I1216 07:06:46.114004 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:46 crc kubenswrapper[4789]: E1216 07:06:46.114245 4789 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 07:06:46 crc kubenswrapper[4789]: E1216 07:06:46.114300 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist podName:fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6 nodeName:}" failed. No retries permitted until 2025-12-16 07:06:47.114284359 +0000 UTC m=+945.376171988 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist") pod "speaker-zk7nk" (UID: "fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6") : secret "metallb-memberlist" not found Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.038952 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerStarted","Data":"00c0122b41841d573c9e0e36d31ff00f0dc563c018d929f3229841b71d832589"} Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.040829 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-pjwhk" event={"ID":"f1fca8a9-546e-4fa3-b08f-cf5df54303e0","Type":"ContainerStarted","Data":"faba09e21b4690969222cef732cb20f5106dddefff86ee7fa05bdf6b3edcd8fd"} Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.040862 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-pjwhk" event={"ID":"f1fca8a9-546e-4fa3-b08f-cf5df54303e0","Type":"ContainerStarted","Data":"4211a1dd42439478dcea35beaeccc80592391bf8dc12a26b443ed1a306c93202"} Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.125605 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.138105 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6-memberlist\") pod \"speaker-zk7nk\" (UID: \"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6\") " pod="metallb-system/speaker-zk7nk" Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.165989 4789 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-pjwhk" podStartSLOduration=2.165742969 podStartE2EDuration="2.165742969s" podCreationTimestamp="2025-12-16 07:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:06:47.057552794 +0000 UTC m=+945.319440423" watchObservedRunningTime="2025-12-16 07:06:47.165742969 +0000 UTC m=+945.427630598" Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.168203 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztph7"] Dec 16 07:06:47 crc kubenswrapper[4789]: I1216 07:06:47.209323 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zk7nk" Dec 16 07:06:47 crc kubenswrapper[4789]: W1216 07:06:47.229476 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc6fbc9_7246_4b82_a97e_d3b09c3b57e6.slice/crio-bbaa90d687f0aa56e79ecf8c634754384b7c3b9d49479556d4f038904d23538c WatchSource:0}: Error finding container bbaa90d687f0aa56e79ecf8c634754384b7c3b9d49479556d4f038904d23538c: Status 404 returned error can't find the container with id bbaa90d687f0aa56e79ecf8c634754384b7c3b9d49479556d4f038904d23538c Dec 16 07:06:48 crc kubenswrapper[4789]: I1216 07:06:48.048556 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zk7nk" event={"ID":"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6","Type":"ContainerStarted","Data":"614327a6cf3fd880f5448985cd4aa438838e230c11aeb712f0fef73fb1ee68b5"} Dec 16 07:06:48 crc kubenswrapper[4789]: I1216 07:06:48.048938 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zk7nk" event={"ID":"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6","Type":"ContainerStarted","Data":"48313ae6582f7eab5b73b315de0421b2d25ad21111fc78557789d8de74af02e8"} Dec 16 07:06:48 crc 
kubenswrapper[4789]: I1216 07:06:48.048959 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:06:48 crc kubenswrapper[4789]: I1216 07:06:48.048972 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zk7nk" event={"ID":"fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6","Type":"ContainerStarted","Data":"bbaa90d687f0aa56e79ecf8c634754384b7c3b9d49479556d4f038904d23538c"} Dec 16 07:06:48 crc kubenswrapper[4789]: I1216 07:06:48.049168 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zk7nk" Dec 16 07:06:48 crc kubenswrapper[4789]: I1216 07:06:48.067248 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zk7nk" podStartSLOduration=3.067233454 podStartE2EDuration="3.067233454s" podCreationTimestamp="2025-12-16 07:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:06:48.064957929 +0000 UTC m=+946.326845558" watchObservedRunningTime="2025-12-16 07:06:48.067233454 +0000 UTC m=+946.329121083" Dec 16 07:06:49 crc kubenswrapper[4789]: I1216 07:06:49.055596 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ztph7" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="registry-server" containerID="cri-o://f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d" gracePeriod=2 Dec 16 07:06:49 crc kubenswrapper[4789]: I1216 07:06:49.986275 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.061714 4789 generic.go:334] "Generic (PLEG): container finished" podID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerID="f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d" exitCode=0 Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.061754 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztph7" event={"ID":"e697dbe4-6faf-419e-9dd1-79e1ec80ab34","Type":"ContainerDied","Data":"f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d"} Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.061761 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztph7" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.061783 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztph7" event={"ID":"e697dbe4-6faf-419e-9dd1-79e1ec80ab34","Type":"ContainerDied","Data":"ef53a8a0fc6ef1f36e9b18fcfe90d43824f755c713c9063868cd94e1bb900f6b"} Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.061799 4789 scope.go:117] "RemoveContainer" containerID="f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.074709 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-catalog-content\") pod \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.074801 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-utilities\") pod 
\"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.074848 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg84b\" (UniqueName: \"kubernetes.io/projected/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-kube-api-access-xg84b\") pod \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\" (UID: \"e697dbe4-6faf-419e-9dd1-79e1ec80ab34\") " Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.075701 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-utilities" (OuterVolumeSpecName: "utilities") pod "e697dbe4-6faf-419e-9dd1-79e1ec80ab34" (UID: "e697dbe4-6faf-419e-9dd1-79e1ec80ab34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.087352 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-kube-api-access-xg84b" (OuterVolumeSpecName: "kube-api-access-xg84b") pod "e697dbe4-6faf-419e-9dd1-79e1ec80ab34" (UID: "e697dbe4-6faf-419e-9dd1-79e1ec80ab34"). InnerVolumeSpecName "kube-api-access-xg84b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.094185 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e697dbe4-6faf-419e-9dd1-79e1ec80ab34" (UID: "e697dbe4-6faf-419e-9dd1-79e1ec80ab34"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.103233 4789 scope.go:117] "RemoveContainer" containerID="45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.132581 4789 scope.go:117] "RemoveContainer" containerID="feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.156966 4789 scope.go:117] "RemoveContainer" containerID="f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d" Dec 16 07:06:50 crc kubenswrapper[4789]: E1216 07:06:50.157472 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d\": container with ID starting with f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d not found: ID does not exist" containerID="f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.157502 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d"} err="failed to get container status \"f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d\": rpc error: code = NotFound desc = could not find container \"f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d\": container with ID starting with f382123e1274452c32670869af860d0c8755713121c0ce2e1ccad21399a0f26d not found: ID does not exist" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.157524 4789 scope.go:117] "RemoveContainer" containerID="45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a" Dec 16 07:06:50 crc kubenswrapper[4789]: E1216 07:06:50.157747 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a\": container with ID starting with 45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a not found: ID does not exist" containerID="45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.157772 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a"} err="failed to get container status \"45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a\": rpc error: code = NotFound desc = could not find container \"45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a\": container with ID starting with 45e94157724ea87db2811464aaa7a167efc0b52b7e6ff37f89a5a1a9f4a2897a not found: ID does not exist" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.157793 4789 scope.go:117] "RemoveContainer" containerID="feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b" Dec 16 07:06:50 crc kubenswrapper[4789]: E1216 07:06:50.158074 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b\": container with ID starting with feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b not found: ID does not exist" containerID="feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.158096 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b"} err="failed to get container status \"feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b\": rpc error: code = NotFound desc = could not find container \"feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b\": 
container with ID starting with feb3c00fadc66b67a892ec3067edce263046036eb451e7c0c49fe70042ba327b not found: ID does not exist" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.176556 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.176593 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.176603 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg84b\" (UniqueName: \"kubernetes.io/projected/e697dbe4-6faf-419e-9dd1-79e1ec80ab34-kube-api-access-xg84b\") on node \"crc\" DevicePath \"\"" Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.379401 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztph7"] Dec 16 07:06:50 crc kubenswrapper[4789]: I1216 07:06:50.389743 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztph7"] Dec 16 07:06:52 crc kubenswrapper[4789]: I1216 07:06:52.112603 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" path="/var/lib/kubelet/pods/e697dbe4-6faf-419e-9dd1-79e1ec80ab34/volumes" Dec 16 07:06:54 crc kubenswrapper[4789]: I1216 07:06:54.097098 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" event={"ID":"f0535b2c-a6bb-4092-b481-ccba194fd9b4","Type":"ContainerStarted","Data":"e3924d6da08da82d3987ed1ccb083e5097933d9c3e1e0015e68fdd741ef6b120"} Dec 16 07:06:55 crc kubenswrapper[4789]: I1216 07:06:55.105084 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="d3ae31c7-d2d7-4a0f-b8eb-59d7be857994" containerID="0ef850a64baed423616a56390d7909d5743a779959ce525649da7926598220dd" exitCode=0 Dec 16 07:06:55 crc kubenswrapper[4789]: I1216 07:06:55.105188 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerDied","Data":"0ef850a64baed423616a56390d7909d5743a779959ce525649da7926598220dd"} Dec 16 07:06:55 crc kubenswrapper[4789]: I1216 07:06:55.105226 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:06:55 crc kubenswrapper[4789]: I1216 07:06:55.129846 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" podStartSLOduration=3.246301856 podStartE2EDuration="10.129813128s" podCreationTimestamp="2025-12-16 07:06:45 +0000 UTC" firstStartedPulling="2025-12-16 07:06:45.794196966 +0000 UTC m=+944.056084595" lastFinishedPulling="2025-12-16 07:06:52.677708238 +0000 UTC m=+950.939595867" observedRunningTime="2025-12-16 07:06:55.126501026 +0000 UTC m=+953.388388655" watchObservedRunningTime="2025-12-16 07:06:55.129813128 +0000 UTC m=+953.391700847" Dec 16 07:06:56 crc kubenswrapper[4789]: I1216 07:06:56.113961 4789 generic.go:334] "Generic (PLEG): container finished" podID="d3ae31c7-d2d7-4a0f-b8eb-59d7be857994" containerID="625108d24f0b81d20747a038e4eaba05d9b64ab57daa3f26e38d3e81b4288226" exitCode=0 Dec 16 07:06:56 crc kubenswrapper[4789]: I1216 07:06:56.114001 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerDied","Data":"625108d24f0b81d20747a038e4eaba05d9b64ab57daa3f26e38d3e81b4288226"} Dec 16 07:06:57 crc kubenswrapper[4789]: I1216 07:06:57.121112 4789 generic.go:334] "Generic (PLEG): container finished" podID="d3ae31c7-d2d7-4a0f-b8eb-59d7be857994" 
containerID="111aacd3fb98730f8cc65430d51f837c6be5cbcd4b10df116366d44d4ee8f5f2" exitCode=0 Dec 16 07:06:57 crc kubenswrapper[4789]: I1216 07:06:57.121161 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerDied","Data":"111aacd3fb98730f8cc65430d51f837c6be5cbcd4b10df116366d44d4ee8f5f2"} Dec 16 07:06:57 crc kubenswrapper[4789]: I1216 07:06:57.212981 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zk7nk" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.130746 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerStarted","Data":"f5cdb57bfaacc7bd6770df90505955afd2fa6be86bf6969e9b7ce02239a417b3"} Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.130788 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerStarted","Data":"85a70e6b9b643caefc36e98345826dc4a5b8d87eb72c360a842a43aa9f337ae7"} Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.130797 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerStarted","Data":"a46bbff7e6e306ba58ef970538de6f797bfce99c5fe99541fecd13394d22524e"} Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.130807 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerStarted","Data":"9e772ba3ca274b5f391227a701a0f1dfa2b2043cbfac2687972229ac6aeba886"} Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.130832 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" 
event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerStarted","Data":"87b399e09d88ebc1881f4d36716a5c6bf6823880b1a2e2aa32cbf8a907bcb619"} Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.130840 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rfbh5" event={"ID":"d3ae31c7-d2d7-4a0f-b8eb-59d7be857994","Type":"ContainerStarted","Data":"5c5512bc8f887daab6288bfb69869da881ea64e44a8a0762f423f9c918fe5505"} Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.130954 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.158423 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rfbh5" podStartSLOduration=6.987576948 podStartE2EDuration="13.158402537s" podCreationTimestamp="2025-12-16 07:06:45 +0000 UTC" firstStartedPulling="2025-12-16 07:06:46.517587163 +0000 UTC m=+944.779474792" lastFinishedPulling="2025-12-16 07:06:52.688412752 +0000 UTC m=+950.950300381" observedRunningTime="2025-12-16 07:06:58.156669445 +0000 UTC m=+956.418557074" watchObservedRunningTime="2025-12-16 07:06:58.158402537 +0000 UTC m=+956.420290166" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.693495 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f"] Dec 16 07:06:58 crc kubenswrapper[4789]: E1216 07:06:58.693983 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="registry-server" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.693997 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="registry-server" Dec 16 07:06:58 crc kubenswrapper[4789]: E1216 07:06:58.694009 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="extract-content" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.694015 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="extract-content" Dec 16 07:06:58 crc kubenswrapper[4789]: E1216 07:06:58.694035 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="extract-utilities" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.694042 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="extract-utilities" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.694138 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e697dbe4-6faf-419e-9dd1-79e1ec80ab34" containerName="registry-server" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.694822 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.700263 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f"] Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.701678 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.801043 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc 
kubenswrapper[4789]: I1216 07:06:58.801113 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.801159 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49g4\" (UniqueName: \"kubernetes.io/projected/0f18cbd4-42d9-4b83-b929-1cb218f960b4-kube-api-access-j49g4\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.902032 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.902289 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.902410 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-j49g4\" (UniqueName: \"kubernetes.io/projected/0f18cbd4-42d9-4b83-b929-1cb218f960b4-kube-api-access-j49g4\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.903455 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.903809 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:58 crc kubenswrapper[4789]: I1216 07:06:58.923691 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49g4\" (UniqueName: \"kubernetes.io/projected/0f18cbd4-42d9-4b83-b929-1cb218f960b4-kube-api-access-j49g4\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:59 crc kubenswrapper[4789]: I1216 07:06:59.016778 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:06:59 crc kubenswrapper[4789]: I1216 07:06:59.185279 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f"] Dec 16 07:06:59 crc kubenswrapper[4789]: W1216 07:06:59.202162 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f18cbd4_42d9_4b83_b929_1cb218f960b4.slice/crio-7b483d514d6b85259057d07e9c0076074e0e44db449d73c5fa1e2d487047636b WatchSource:0}: Error finding container 7b483d514d6b85259057d07e9c0076074e0e44db449d73c5fa1e2d487047636b: Status 404 returned error can't find the container with id 7b483d514d6b85259057d07e9c0076074e0e44db449d73c5fa1e2d487047636b Dec 16 07:07:00 crc kubenswrapper[4789]: I1216 07:07:00.147881 4789 generic.go:334] "Generic (PLEG): container finished" podID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerID="3f2930df04dd2e933a66a8e1d0a96b161cdb667d585c74a28b3bfd3f26efa728" exitCode=0 Dec 16 07:07:00 crc kubenswrapper[4789]: I1216 07:07:00.147946 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" event={"ID":"0f18cbd4-42d9-4b83-b929-1cb218f960b4","Type":"ContainerDied","Data":"3f2930df04dd2e933a66a8e1d0a96b161cdb667d585c74a28b3bfd3f26efa728"} Dec 16 07:07:00 crc kubenswrapper[4789]: I1216 07:07:00.147966 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" event={"ID":"0f18cbd4-42d9-4b83-b929-1cb218f960b4","Type":"ContainerStarted","Data":"7b483d514d6b85259057d07e9c0076074e0e44db449d73c5fa1e2d487047636b"} Dec 16 07:07:00 crc kubenswrapper[4789]: I1216 07:07:00.581481 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:07:00 crc kubenswrapper[4789]: I1216 07:07:00.621852 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:07:04 crc kubenswrapper[4789]: I1216 07:07:04.176571 4789 generic.go:334] "Generic (PLEG): container finished" podID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerID="2b3baec94747f50704c33447f7546a1b7db2589bff6b4575e8a202c2ccd378d1" exitCode=0 Dec 16 07:07:04 crc kubenswrapper[4789]: I1216 07:07:04.176613 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" event={"ID":"0f18cbd4-42d9-4b83-b929-1cb218f960b4","Type":"ContainerDied","Data":"2b3baec94747f50704c33447f7546a1b7db2589bff6b4575e8a202c2ccd378d1"} Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.184641 4789 generic.go:334] "Generic (PLEG): container finished" podID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerID="3b4fcbfebf9bab2dc9598adc0b3fad8ac3466b2428abc60d2e112052e30e5a35" exitCode=0 Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.184682 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" event={"ID":"0f18cbd4-42d9-4b83-b929-1cb218f960b4","Type":"ContainerDied","Data":"3b4fcbfebf9bab2dc9598adc0b3fad8ac3466b2428abc60d2e112052e30e5a35"} Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.596048 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5spwp" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.653339 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xjtfk"] Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.655529 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.665207 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjtfk"] Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.746316 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-pjwhk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.789054 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-catalog-content\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.789108 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-utilities\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.789181 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxscv\" (UniqueName: \"kubernetes.io/projected/55a2e52c-c6db-48fb-b04d-0c29179aab45-kube-api-access-xxscv\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.889894 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxscv\" (UniqueName: \"kubernetes.io/projected/55a2e52c-c6db-48fb-b04d-0c29179aab45-kube-api-access-xxscv\") pod \"certified-operators-xjtfk\" (UID: 
\"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.890047 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-catalog-content\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.890084 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-utilities\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.890663 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-utilities\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.890988 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-catalog-content\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.925017 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxscv\" (UniqueName: \"kubernetes.io/projected/55a2e52c-c6db-48fb-b04d-0c29179aab45-kube-api-access-xxscv\") pod \"certified-operators-xjtfk\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " 
pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:05 crc kubenswrapper[4789]: I1216 07:07:05.978315 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.445542 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjtfk"] Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.663954 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.859117 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-bundle\") pod \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.859177 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j49g4\" (UniqueName: \"kubernetes.io/projected/0f18cbd4-42d9-4b83-b929-1cb218f960b4-kube-api-access-j49g4\") pod \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.859255 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-util\") pod \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\" (UID: \"0f18cbd4-42d9-4b83-b929-1cb218f960b4\") " Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.859974 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-bundle" (OuterVolumeSpecName: "bundle") pod "0f18cbd4-42d9-4b83-b929-1cb218f960b4" (UID: 
"0f18cbd4-42d9-4b83-b929-1cb218f960b4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.865158 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f18cbd4-42d9-4b83-b929-1cb218f960b4-kube-api-access-j49g4" (OuterVolumeSpecName: "kube-api-access-j49g4") pod "0f18cbd4-42d9-4b83-b929-1cb218f960b4" (UID: "0f18cbd4-42d9-4b83-b929-1cb218f960b4"). InnerVolumeSpecName "kube-api-access-j49g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.869456 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-util" (OuterVolumeSpecName: "util") pod "0f18cbd4-42d9-4b83-b929-1cb218f960b4" (UID: "0f18cbd4-42d9-4b83-b929-1cb218f960b4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.960492 4789 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.960573 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j49g4\" (UniqueName: \"kubernetes.io/projected/0f18cbd4-42d9-4b83-b929-1cb218f960b4-kube-api-access-j49g4\") on node \"crc\" DevicePath \"\"" Dec 16 07:07:06 crc kubenswrapper[4789]: I1216 07:07:06.960591 4789 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f18cbd4-42d9-4b83-b929-1cb218f960b4-util\") on node \"crc\" DevicePath \"\"" Dec 16 07:07:07 crc kubenswrapper[4789]: I1216 07:07:07.214942 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" 
event={"ID":"0f18cbd4-42d9-4b83-b929-1cb218f960b4","Type":"ContainerDied","Data":"7b483d514d6b85259057d07e9c0076074e0e44db449d73c5fa1e2d487047636b"} Dec 16 07:07:07 crc kubenswrapper[4789]: I1216 07:07:07.215652 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b483d514d6b85259057d07e9c0076074e0e44db449d73c5fa1e2d487047636b" Dec 16 07:07:07 crc kubenswrapper[4789]: I1216 07:07:07.214983 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f" Dec 16 07:07:07 crc kubenswrapper[4789]: I1216 07:07:07.216938 4789 generic.go:334] "Generic (PLEG): container finished" podID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerID="48caff0dab348a27f59796ffeeb8cfaa1a533e123c633e51fcdd603101427a8c" exitCode=0 Dec 16 07:07:07 crc kubenswrapper[4789]: I1216 07:07:07.216991 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjtfk" event={"ID":"55a2e52c-c6db-48fb-b04d-0c29179aab45","Type":"ContainerDied","Data":"48caff0dab348a27f59796ffeeb8cfaa1a533e123c633e51fcdd603101427a8c"} Dec 16 07:07:07 crc kubenswrapper[4789]: I1216 07:07:07.217019 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjtfk" event={"ID":"55a2e52c-c6db-48fb-b04d-0c29179aab45","Type":"ContainerStarted","Data":"beb4bed6ec9c79246102e3bc7629844ca7abba8a9f41142f2e79d68d29165972"} Dec 16 07:07:08 crc kubenswrapper[4789]: I1216 07:07:08.223744 4789 generic.go:334] "Generic (PLEG): container finished" podID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerID="4030785d98af4a23461c43a3eb30e00202fca73ad4112b0ed8916a1e6be5f970" exitCode=0 Dec 16 07:07:08 crc kubenswrapper[4789]: I1216 07:07:08.223786 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjtfk" 
event={"ID":"55a2e52c-c6db-48fb-b04d-0c29179aab45","Type":"ContainerDied","Data":"4030785d98af4a23461c43a3eb30e00202fca73ad4112b0ed8916a1e6be5f970"} Dec 16 07:07:09 crc kubenswrapper[4789]: I1216 07:07:09.237214 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjtfk" event={"ID":"55a2e52c-c6db-48fb-b04d-0c29179aab45","Type":"ContainerStarted","Data":"5c9f4d45d295e0d20ca525c882a423c528138ea972990698f6da7ef98e5a0ede"} Dec 16 07:07:09 crc kubenswrapper[4789]: I1216 07:07:09.262413 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xjtfk" podStartSLOduration=2.866256847 podStartE2EDuration="4.262398386s" podCreationTimestamp="2025-12-16 07:07:05 +0000 UTC" firstStartedPulling="2025-12-16 07:07:07.219040096 +0000 UTC m=+965.480927725" lastFinishedPulling="2025-12-16 07:07:08.615181635 +0000 UTC m=+966.877069264" observedRunningTime="2025-12-16 07:07:09.261036573 +0000 UTC m=+967.522924202" watchObservedRunningTime="2025-12-16 07:07:09.262398386 +0000 UTC m=+967.524286005" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.815127 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w"] Dec 16 07:07:11 crc kubenswrapper[4789]: E1216 07:07:11.815778 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerName="extract" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.815797 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerName="extract" Dec 16 07:07:11 crc kubenswrapper[4789]: E1216 07:07:11.815830 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerName="util" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.815841 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerName="util" Dec 16 07:07:11 crc kubenswrapper[4789]: E1216 07:07:11.815864 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerName="pull" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.815875 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerName="pull" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.816086 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f18cbd4-42d9-4b83-b929-1cb218f960b4" containerName="extract" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.816666 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.819072 4789 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-vv5r8" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.819084 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.819789 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.834560 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w"] Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.838023 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6mk\" (UniqueName: \"kubernetes.io/projected/d3e16e1e-17b1-4c04-9657-d72012b7c2ba-kube-api-access-zz6mk\") pod \"cert-manager-operator-controller-manager-64cf6dff88-k999w\" 
(UID: \"d3e16e1e-17b1-4c04-9657-d72012b7c2ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.838112 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3e16e1e-17b1-4c04-9657-d72012b7c2ba-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-k999w\" (UID: \"d3e16e1e-17b1-4c04-9657-d72012b7c2ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.938671 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6mk\" (UniqueName: \"kubernetes.io/projected/d3e16e1e-17b1-4c04-9657-d72012b7c2ba-kube-api-access-zz6mk\") pod \"cert-manager-operator-controller-manager-64cf6dff88-k999w\" (UID: \"d3e16e1e-17b1-4c04-9657-d72012b7c2ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.938769 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3e16e1e-17b1-4c04-9657-d72012b7c2ba-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-k999w\" (UID: \"d3e16e1e-17b1-4c04-9657-d72012b7c2ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.939274 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3e16e1e-17b1-4c04-9657-d72012b7c2ba-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-k999w\" (UID: \"d3e16e1e-17b1-4c04-9657-d72012b7c2ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:11 crc kubenswrapper[4789]: I1216 07:07:11.959406 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6mk\" (UniqueName: \"kubernetes.io/projected/d3e16e1e-17b1-4c04-9657-d72012b7c2ba-kube-api-access-zz6mk\") pod \"cert-manager-operator-controller-manager-64cf6dff88-k999w\" (UID: \"d3e16e1e-17b1-4c04-9657-d72012b7c2ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:12 crc kubenswrapper[4789]: I1216 07:07:12.130860 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" Dec 16 07:07:12 crc kubenswrapper[4789]: I1216 07:07:12.391307 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w"] Dec 16 07:07:13 crc kubenswrapper[4789]: I1216 07:07:13.276255 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" event={"ID":"d3e16e1e-17b1-4c04-9657-d72012b7c2ba","Type":"ContainerStarted","Data":"c1a2573d663eb082767568e7eaedfce07e79af6b91a7062cf8727a179ba903dc"} Dec 16 07:07:15 crc kubenswrapper[4789]: I1216 07:07:15.587576 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rfbh5" Dec 16 07:07:15 crc kubenswrapper[4789]: I1216 07:07:15.978542 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:15 crc kubenswrapper[4789]: I1216 07:07:15.978813 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:16 crc kubenswrapper[4789]: I1216 07:07:16.017798 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:16 crc kubenswrapper[4789]: I1216 07:07:16.429978 4789 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:18 crc kubenswrapper[4789]: I1216 07:07:18.438510 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjtfk"] Dec 16 07:07:18 crc kubenswrapper[4789]: I1216 07:07:18.439121 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xjtfk" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="registry-server" containerID="cri-o://5c9f4d45d295e0d20ca525c882a423c528138ea972990698f6da7ef98e5a0ede" gracePeriod=2 Dec 16 07:07:19 crc kubenswrapper[4789]: I1216 07:07:19.343741 4789 generic.go:334] "Generic (PLEG): container finished" podID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerID="5c9f4d45d295e0d20ca525c882a423c528138ea972990698f6da7ef98e5a0ede" exitCode=0 Dec 16 07:07:19 crc kubenswrapper[4789]: I1216 07:07:19.343787 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjtfk" event={"ID":"55a2e52c-c6db-48fb-b04d-0c29179aab45","Type":"ContainerDied","Data":"5c9f4d45d295e0d20ca525c882a423c528138ea972990698f6da7ef98e5a0ede"} Dec 16 07:07:21 crc kubenswrapper[4789]: I1216 07:07:21.949509 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.097748 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-catalog-content\") pod \"55a2e52c-c6db-48fb-b04d-0c29179aab45\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.098143 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxscv\" (UniqueName: \"kubernetes.io/projected/55a2e52c-c6db-48fb-b04d-0c29179aab45-kube-api-access-xxscv\") pod \"55a2e52c-c6db-48fb-b04d-0c29179aab45\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.098184 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-utilities\") pod \"55a2e52c-c6db-48fb-b04d-0c29179aab45\" (UID: \"55a2e52c-c6db-48fb-b04d-0c29179aab45\") " Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.098995 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-utilities" (OuterVolumeSpecName: "utilities") pod "55a2e52c-c6db-48fb-b04d-0c29179aab45" (UID: "55a2e52c-c6db-48fb-b04d-0c29179aab45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.105546 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a2e52c-c6db-48fb-b04d-0c29179aab45-kube-api-access-xxscv" (OuterVolumeSpecName: "kube-api-access-xxscv") pod "55a2e52c-c6db-48fb-b04d-0c29179aab45" (UID: "55a2e52c-c6db-48fb-b04d-0c29179aab45"). InnerVolumeSpecName "kube-api-access-xxscv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.157129 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55a2e52c-c6db-48fb-b04d-0c29179aab45" (UID: "55a2e52c-c6db-48fb-b04d-0c29179aab45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.198974 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.199007 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a2e52c-c6db-48fb-b04d-0c29179aab45-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.199017 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxscv\" (UniqueName: \"kubernetes.io/projected/55a2e52c-c6db-48fb-b04d-0c29179aab45-kube-api-access-xxscv\") on node \"crc\" DevicePath \"\"" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.404969 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" event={"ID":"d3e16e1e-17b1-4c04-9657-d72012b7c2ba","Type":"ContainerStarted","Data":"6abf3f14f78ac69e41ce6971fe30fc48af6ab26eca0f71a9152452668fd956c8"} Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.407868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjtfk" event={"ID":"55a2e52c-c6db-48fb-b04d-0c29179aab45","Type":"ContainerDied","Data":"beb4bed6ec9c79246102e3bc7629844ca7abba8a9f41142f2e79d68d29165972"} Dec 16 
07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.407913 4789 scope.go:117] "RemoveContainer" containerID="5c9f4d45d295e0d20ca525c882a423c528138ea972990698f6da7ef98e5a0ede" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.407977 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjtfk" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.426813 4789 scope.go:117] "RemoveContainer" containerID="4030785d98af4a23461c43a3eb30e00202fca73ad4112b0ed8916a1e6be5f970" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.429345 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-k999w" podStartSLOduration=1.853989993 podStartE2EDuration="11.429267839s" podCreationTimestamp="2025-12-16 07:07:11 +0000 UTC" firstStartedPulling="2025-12-16 07:07:12.4340926 +0000 UTC m=+970.695980229" lastFinishedPulling="2025-12-16 07:07:22.009370446 +0000 UTC m=+980.271258075" observedRunningTime="2025-12-16 07:07:22.427162307 +0000 UTC m=+980.689049936" watchObservedRunningTime="2025-12-16 07:07:22.429267839 +0000 UTC m=+980.691155498" Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.446458 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xjtfk"] Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.463193 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xjtfk"] Dec 16 07:07:22 crc kubenswrapper[4789]: I1216 07:07:22.466629 4789 scope.go:117] "RemoveContainer" containerID="48caff0dab348a27f59796ffeeb8cfaa1a533e123c633e51fcdd603101427a8c" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.142752 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" path="/var/lib/kubelet/pods/55a2e52c-c6db-48fb-b04d-0c29179aab45/volumes" Dec 16 
07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.320809 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-vtvzn"] Dec 16 07:07:24 crc kubenswrapper[4789]: E1216 07:07:24.321053 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="extract-content" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.321069 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="extract-content" Dec 16 07:07:24 crc kubenswrapper[4789]: E1216 07:07:24.321081 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="extract-utilities" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.321088 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="extract-utilities" Dec 16 07:07:24 crc kubenswrapper[4789]: E1216 07:07:24.321105 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="registry-server" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.321111 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="registry-server" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.321233 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a2e52c-c6db-48fb-b04d-0c29179aab45" containerName="registry-server" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.321673 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.323509 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.323607 4789 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2bd5d" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.323781 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.326954 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n84t\" (UniqueName: \"kubernetes.io/projected/3c23c454-c4d1-4e67-bd4a-69e1014e5a5c-kube-api-access-7n84t\") pod \"cert-manager-webhook-f4fb5df64-vtvzn\" (UID: \"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.327037 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c23c454-c4d1-4e67-bd4a-69e1014e5a5c-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-vtvzn\" (UID: \"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.342974 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-vtvzn"] Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.427946 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n84t\" (UniqueName: \"kubernetes.io/projected/3c23c454-c4d1-4e67-bd4a-69e1014e5a5c-kube-api-access-7n84t\") pod \"cert-manager-webhook-f4fb5df64-vtvzn\" (UID: 
\"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.427985 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c23c454-c4d1-4e67-bd4a-69e1014e5a5c-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-vtvzn\" (UID: \"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.447420 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n84t\" (UniqueName: \"kubernetes.io/projected/3c23c454-c4d1-4e67-bd4a-69e1014e5a5c-kube-api-access-7n84t\") pod \"cert-manager-webhook-f4fb5df64-vtvzn\" (UID: \"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.448035 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c23c454-c4d1-4e67-bd4a-69e1014e5a5c-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-vtvzn\" (UID: \"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:24 crc kubenswrapper[4789]: I1216 07:07:24.638841 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:25 crc kubenswrapper[4789]: I1216 07:07:25.186578 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-vtvzn"] Dec 16 07:07:25 crc kubenswrapper[4789]: I1216 07:07:25.428288 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" event={"ID":"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c","Type":"ContainerStarted","Data":"ea6e02475cc1835d8af542b59ed3f66b13833433bdfcc7f6ca1cbf20fd08da2b"} Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.137543 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf"] Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.141606 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.142535 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf"] Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.144231 4789 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5vmfj" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.288134 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gthm4\" (UniqueName: \"kubernetes.io/projected/e8660c70-e0a3-4c56-aff9-eccfb4fa297d-kube-api-access-gthm4\") pod \"cert-manager-cainjector-855d9ccff4-xnqdf\" (UID: \"e8660c70-e0a3-4c56-aff9-eccfb4fa297d\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.288498 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/e8660c70-e0a3-4c56-aff9-eccfb4fa297d-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-xnqdf\" (UID: \"e8660c70-e0a3-4c56-aff9-eccfb4fa297d\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.390274 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gthm4\" (UniqueName: \"kubernetes.io/projected/e8660c70-e0a3-4c56-aff9-eccfb4fa297d-kube-api-access-gthm4\") pod \"cert-manager-cainjector-855d9ccff4-xnqdf\" (UID: \"e8660c70-e0a3-4c56-aff9-eccfb4fa297d\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.390313 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8660c70-e0a3-4c56-aff9-eccfb4fa297d-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-xnqdf\" (UID: \"e8660c70-e0a3-4c56-aff9-eccfb4fa297d\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.411512 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gthm4\" (UniqueName: \"kubernetes.io/projected/e8660c70-e0a3-4c56-aff9-eccfb4fa297d-kube-api-access-gthm4\") pod \"cert-manager-cainjector-855d9ccff4-xnqdf\" (UID: \"e8660c70-e0a3-4c56-aff9-eccfb4fa297d\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.411891 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8660c70-e0a3-4c56-aff9-eccfb4fa297d-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-xnqdf\" (UID: \"e8660c70-e0a3-4c56-aff9-eccfb4fa297d\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.467575 4789 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" Dec 16 07:07:28 crc kubenswrapper[4789]: I1216 07:07:28.862871 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf"] Dec 16 07:07:29 crc kubenswrapper[4789]: I1216 07:07:29.460419 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" event={"ID":"e8660c70-e0a3-4c56-aff9-eccfb4fa297d","Type":"ContainerStarted","Data":"6aaf8176dd239c59e8fe81291301ad570fd7721e426fa7220dfef0128121ff5e"} Dec 16 07:07:34 crc kubenswrapper[4789]: I1216 07:07:34.502554 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" event={"ID":"3c23c454-c4d1-4e67-bd4a-69e1014e5a5c","Type":"ContainerStarted","Data":"26fa7eb9edd7bf8b86fb4ced5d00ebb477c0660f18a53a31b8b0bae75e49f830"} Dec 16 07:07:34 crc kubenswrapper[4789]: I1216 07:07:34.503264 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:34 crc kubenswrapper[4789]: I1216 07:07:34.504793 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" event={"ID":"e8660c70-e0a3-4c56-aff9-eccfb4fa297d","Type":"ContainerStarted","Data":"972097fcde322c978ed9bc562ad733c0a36798fee47beec6466a345cae71a62e"} Dec 16 07:07:34 crc kubenswrapper[4789]: I1216 07:07:34.521476 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" podStartSLOduration=1.698335951 podStartE2EDuration="10.52145789s" podCreationTimestamp="2025-12-16 07:07:24 +0000 UTC" firstStartedPulling="2025-12-16 07:07:25.195974207 +0000 UTC m=+983.457861836" lastFinishedPulling="2025-12-16 07:07:34.019096136 +0000 UTC m=+992.280983775" observedRunningTime="2025-12-16 07:07:34.516456297 +0000 UTC 
m=+992.778343926" watchObservedRunningTime="2025-12-16 07:07:34.52145789 +0000 UTC m=+992.783345519" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.253298 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xnqdf" podStartSLOduration=5.12073007 podStartE2EDuration="10.253274442s" podCreationTimestamp="2025-12-16 07:07:28 +0000 UTC" firstStartedPulling="2025-12-16 07:07:28.87220141 +0000 UTC m=+987.134089039" lastFinishedPulling="2025-12-16 07:07:34.004745782 +0000 UTC m=+992.266633411" observedRunningTime="2025-12-16 07:07:34.53890463 +0000 UTC m=+992.800792259" watchObservedRunningTime="2025-12-16 07:07:38.253274442 +0000 UTC m=+996.515162071" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.255745 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-6jrv2"] Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.256549 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.259240 4789 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wjlmv" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.274887 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-6jrv2"] Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.423841 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjpfn\" (UniqueName: \"kubernetes.io/projected/b457f25c-782e-4215-9e13-afcbf2c32dc6-kube-api-access-wjpfn\") pod \"cert-manager-86cb77c54b-6jrv2\" (UID: \"b457f25c-782e-4215-9e13-afcbf2c32dc6\") " pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.423928 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b457f25c-782e-4215-9e13-afcbf2c32dc6-bound-sa-token\") pod \"cert-manager-86cb77c54b-6jrv2\" (UID: \"b457f25c-782e-4215-9e13-afcbf2c32dc6\") " pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.525567 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjpfn\" (UniqueName: \"kubernetes.io/projected/b457f25c-782e-4215-9e13-afcbf2c32dc6-kube-api-access-wjpfn\") pod \"cert-manager-86cb77c54b-6jrv2\" (UID: \"b457f25c-782e-4215-9e13-afcbf2c32dc6\") " pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.526352 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b457f25c-782e-4215-9e13-afcbf2c32dc6-bound-sa-token\") pod \"cert-manager-86cb77c54b-6jrv2\" (UID: 
\"b457f25c-782e-4215-9e13-afcbf2c32dc6\") " pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.543489 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b457f25c-782e-4215-9e13-afcbf2c32dc6-bound-sa-token\") pod \"cert-manager-86cb77c54b-6jrv2\" (UID: \"b457f25c-782e-4215-9e13-afcbf2c32dc6\") " pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.544450 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjpfn\" (UniqueName: \"kubernetes.io/projected/b457f25c-782e-4215-9e13-afcbf2c32dc6-kube-api-access-wjpfn\") pod \"cert-manager-86cb77c54b-6jrv2\" (UID: \"b457f25c-782e-4215-9e13-afcbf2c32dc6\") " pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.575662 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-6jrv2" Dec 16 07:07:38 crc kubenswrapper[4789]: I1216 07:07:38.970141 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-6jrv2"] Dec 16 07:07:39 crc kubenswrapper[4789]: I1216 07:07:39.532244 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-6jrv2" event={"ID":"b457f25c-782e-4215-9e13-afcbf2c32dc6","Type":"ContainerStarted","Data":"7759b8a3cedd577727dc824b4b297e09ae3dc82aea99026aa618a298fc0480c7"} Dec 16 07:07:39 crc kubenswrapper[4789]: I1216 07:07:39.532295 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-6jrv2" event={"ID":"b457f25c-782e-4215-9e13-afcbf2c32dc6","Type":"ContainerStarted","Data":"ac9988fbc2c6d67646a10dfefade423d0b64014f65d761f198633b592ac134ac"} Dec 16 07:07:39 crc kubenswrapper[4789]: I1216 07:07:39.549437 4789 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-86cb77c54b-6jrv2" podStartSLOduration=1.549414657 podStartE2EDuration="1.549414657s" podCreationTimestamp="2025-12-16 07:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:07:39.545243824 +0000 UTC m=+997.807131453" watchObservedRunningTime="2025-12-16 07:07:39.549414657 +0000 UTC m=+997.811302286" Dec 16 07:07:39 crc kubenswrapper[4789]: I1216 07:07:39.641575 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-vtvzn" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.262049 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wfzl5"] Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.263260 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wfzl5" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.265526 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.266711 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.266721 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hm644" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.267761 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wfzl5"] Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.423036 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx6g8\" (UniqueName: 
\"kubernetes.io/projected/6b946a67-f812-4a36-991d-49fdd54e99b6-kube-api-access-xx6g8\") pod \"openstack-operator-index-wfzl5\" (UID: \"6b946a67-f812-4a36-991d-49fdd54e99b6\") " pod="openstack-operators/openstack-operator-index-wfzl5" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.524721 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx6g8\" (UniqueName: \"kubernetes.io/projected/6b946a67-f812-4a36-991d-49fdd54e99b6-kube-api-access-xx6g8\") pod \"openstack-operator-index-wfzl5\" (UID: \"6b946a67-f812-4a36-991d-49fdd54e99b6\") " pod="openstack-operators/openstack-operator-index-wfzl5" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.545044 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx6g8\" (UniqueName: \"kubernetes.io/projected/6b946a67-f812-4a36-991d-49fdd54e99b6-kube-api-access-xx6g8\") pod \"openstack-operator-index-wfzl5\" (UID: \"6b946a67-f812-4a36-991d-49fdd54e99b6\") " pod="openstack-operators/openstack-operator-index-wfzl5" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.581608 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wfzl5" Dec 16 07:07:46 crc kubenswrapper[4789]: I1216 07:07:46.968095 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wfzl5"] Dec 16 07:07:47 crc kubenswrapper[4789]: I1216 07:07:47.581185 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wfzl5" event={"ID":"6b946a67-f812-4a36-991d-49fdd54e99b6","Type":"ContainerStarted","Data":"f914c39e6f607a2db4b2d66e76e425b7459b5399204b81201189f9e6bba5e321"} Dec 16 07:07:48 crc kubenswrapper[4789]: I1216 07:07:48.589174 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wfzl5" event={"ID":"6b946a67-f812-4a36-991d-49fdd54e99b6","Type":"ContainerStarted","Data":"52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427"} Dec 16 07:07:48 crc kubenswrapper[4789]: I1216 07:07:48.603699 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wfzl5" podStartSLOduration=1.711712038 podStartE2EDuration="2.603677968s" podCreationTimestamp="2025-12-16 07:07:46 +0000 UTC" firstStartedPulling="2025-12-16 07:07:46.979863641 +0000 UTC m=+1005.241751270" lastFinishedPulling="2025-12-16 07:07:47.871829571 +0000 UTC m=+1006.133717200" observedRunningTime="2025-12-16 07:07:48.601243058 +0000 UTC m=+1006.863130707" watchObservedRunningTime="2025-12-16 07:07:48.603677968 +0000 UTC m=+1006.865565617" Dec 16 07:07:51 crc kubenswrapper[4789]: I1216 07:07:51.442842 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wfzl5"] Dec 16 07:07:51 crc kubenswrapper[4789]: I1216 07:07:51.443297 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wfzl5" podUID="6b946a67-f812-4a36-991d-49fdd54e99b6" containerName="registry-server" 
containerID="cri-o://52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427" gracePeriod=2 Dec 16 07:07:51 crc kubenswrapper[4789]: I1216 07:07:51.927776 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:07:51 crc kubenswrapper[4789]: I1216 07:07:51.927849 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.043304 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z5pmx"] Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.043982 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.051413 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z5pmx"] Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.191675 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6f9p\" (UniqueName: \"kubernetes.io/projected/424c1bbc-8b15-4d1e-8988-9f514926d253-kube-api-access-v6f9p\") pod \"openstack-operator-index-z5pmx\" (UID: \"424c1bbc-8b15-4d1e-8988-9f514926d253\") " pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.293924 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6f9p\" (UniqueName: \"kubernetes.io/projected/424c1bbc-8b15-4d1e-8988-9f514926d253-kube-api-access-v6f9p\") pod \"openstack-operator-index-z5pmx\" (UID: \"424c1bbc-8b15-4d1e-8988-9f514926d253\") " pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.315725 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wfzl5" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.318521 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6f9p\" (UniqueName: \"kubernetes.io/projected/424c1bbc-8b15-4d1e-8988-9f514926d253-kube-api-access-v6f9p\") pod \"openstack-operator-index-z5pmx\" (UID: \"424c1bbc-8b15-4d1e-8988-9f514926d253\") " pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.357491 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.495424 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx6g8\" (UniqueName: \"kubernetes.io/projected/6b946a67-f812-4a36-991d-49fdd54e99b6-kube-api-access-xx6g8\") pod \"6b946a67-f812-4a36-991d-49fdd54e99b6\" (UID: \"6b946a67-f812-4a36-991d-49fdd54e99b6\") " Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.499209 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b946a67-f812-4a36-991d-49fdd54e99b6-kube-api-access-xx6g8" (OuterVolumeSpecName: "kube-api-access-xx6g8") pod "6b946a67-f812-4a36-991d-49fdd54e99b6" (UID: "6b946a67-f812-4a36-991d-49fdd54e99b6"). InnerVolumeSpecName "kube-api-access-xx6g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.597359 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx6g8\" (UniqueName: \"kubernetes.io/projected/6b946a67-f812-4a36-991d-49fdd54e99b6-kube-api-access-xx6g8\") on node \"crc\" DevicePath \"\"" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.615212 4789 generic.go:334] "Generic (PLEG): container finished" podID="6b946a67-f812-4a36-991d-49fdd54e99b6" containerID="52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427" exitCode=0 Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.615243 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wfzl5" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.615253 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wfzl5" event={"ID":"6b946a67-f812-4a36-991d-49fdd54e99b6","Type":"ContainerDied","Data":"52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427"} Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.615375 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wfzl5" event={"ID":"6b946a67-f812-4a36-991d-49fdd54e99b6","Type":"ContainerDied","Data":"f914c39e6f607a2db4b2d66e76e425b7459b5399204b81201189f9e6bba5e321"} Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.615398 4789 scope.go:117] "RemoveContainer" containerID="52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.638720 4789 scope.go:117] "RemoveContainer" containerID="52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427" Dec 16 07:07:52 crc kubenswrapper[4789]: E1216 07:07:52.639551 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427\": container with ID starting with 52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427 not found: ID does not exist" containerID="52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.639581 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427"} err="failed to get container status \"52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427\": rpc error: code = NotFound desc = could not find container 
\"52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427\": container with ID starting with 52032ca3d07b0e80e227deb2f58334111d0dabf3feee117348c5119ddd78c427 not found: ID does not exist" Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.642902 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wfzl5"] Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.646366 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wfzl5"] Dec 16 07:07:52 crc kubenswrapper[4789]: I1216 07:07:52.741801 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z5pmx"] Dec 16 07:07:52 crc kubenswrapper[4789]: W1216 07:07:52.745546 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod424c1bbc_8b15_4d1e_8988_9f514926d253.slice/crio-e413e013ddfcf1d6022060c71f82ead2520b069b7e7f87203fd54e2fd735b177 WatchSource:0}: Error finding container e413e013ddfcf1d6022060c71f82ead2520b069b7e7f87203fd54e2fd735b177: Status 404 returned error can't find the container with id e413e013ddfcf1d6022060c71f82ead2520b069b7e7f87203fd54e2fd735b177 Dec 16 07:07:53 crc kubenswrapper[4789]: I1216 07:07:53.632367 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5pmx" event={"ID":"424c1bbc-8b15-4d1e-8988-9f514926d253","Type":"ContainerStarted","Data":"4e93fc1ffa0b368f20acc2576368788105bf718ca9765bc2f900fbeea4f731c1"} Dec 16 07:07:53 crc kubenswrapper[4789]: I1216 07:07:53.632686 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5pmx" event={"ID":"424c1bbc-8b15-4d1e-8988-9f514926d253","Type":"ContainerStarted","Data":"e413e013ddfcf1d6022060c71f82ead2520b069b7e7f87203fd54e2fd735b177"} Dec 16 07:07:53 crc kubenswrapper[4789]: I1216 07:07:53.658069 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z5pmx" podStartSLOduration=1.243007513 podStartE2EDuration="1.658048056s" podCreationTimestamp="2025-12-16 07:07:52 +0000 UTC" firstStartedPulling="2025-12-16 07:07:52.747856566 +0000 UTC m=+1011.009744195" lastFinishedPulling="2025-12-16 07:07:53.162897109 +0000 UTC m=+1011.424784738" observedRunningTime="2025-12-16 07:07:53.654269003 +0000 UTC m=+1011.916156642" watchObservedRunningTime="2025-12-16 07:07:53.658048056 +0000 UTC m=+1011.919935715" Dec 16 07:07:54 crc kubenswrapper[4789]: I1216 07:07:54.111545 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b946a67-f812-4a36-991d-49fdd54e99b6" path="/var/lib/kubelet/pods/6b946a67-f812-4a36-991d-49fdd54e99b6/volumes" Dec 16 07:08:02 crc kubenswrapper[4789]: I1216 07:08:02.357924 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:08:02 crc kubenswrapper[4789]: I1216 07:08:02.358480 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:08:02 crc kubenswrapper[4789]: I1216 07:08:02.387619 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:08:02 crc kubenswrapper[4789]: I1216 07:08:02.711709 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-z5pmx" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.482033 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j"] Dec 16 07:08:16 crc kubenswrapper[4789]: E1216 07:08:16.482722 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b946a67-f812-4a36-991d-49fdd54e99b6" 
containerName="registry-server" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.482734 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b946a67-f812-4a36-991d-49fdd54e99b6" containerName="registry-server" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.482838 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b946a67-f812-4a36-991d-49fdd54e99b6" containerName="registry-server" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.483627 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.485415 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j2qsg" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.506778 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j"] Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.536873 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-util\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.536999 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqvz\" (UniqueName: \"kubernetes.io/projected/e147cdc2-fabb-407d-992a-d4a654c09fa2-kube-api-access-cvqvz\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " 
pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.537193 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-bundle\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.638709 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-bundle\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.638797 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-util\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.638847 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvqvz\" (UniqueName: \"kubernetes.io/projected/e147cdc2-fabb-407d-992a-d4a654c09fa2-kube-api-access-cvqvz\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 
07:08:16.639365 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-util\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.640091 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-bundle\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.672523 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqvz\" (UniqueName: \"kubernetes.io/projected/e147cdc2-fabb-407d-992a-d4a654c09fa2-kube-api-access-cvqvz\") pod \"d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:16 crc kubenswrapper[4789]: I1216 07:08:16.814792 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:17 crc kubenswrapper[4789]: I1216 07:08:17.187146 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j"] Dec 16 07:08:17 crc kubenswrapper[4789]: I1216 07:08:17.786725 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" event={"ID":"e147cdc2-fabb-407d-992a-d4a654c09fa2","Type":"ContainerStarted","Data":"5751d8afbf7458223e5aa21180ce6ef0134547e52a65551f584da1d61996ed38"} Dec 16 07:08:19 crc kubenswrapper[4789]: I1216 07:08:19.800554 4789 generic.go:334] "Generic (PLEG): container finished" podID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerID="f8dde8d48fae82e51191e3bf530713e2383c0cf0333010bb03ce4c862439cc07" exitCode=0 Dec 16 07:08:19 crc kubenswrapper[4789]: I1216 07:08:19.800644 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" event={"ID":"e147cdc2-fabb-407d-992a-d4a654c09fa2","Type":"ContainerDied","Data":"f8dde8d48fae82e51191e3bf530713e2383c0cf0333010bb03ce4c862439cc07"} Dec 16 07:08:21 crc kubenswrapper[4789]: I1216 07:08:21.816221 4789 generic.go:334] "Generic (PLEG): container finished" podID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerID="3aebd91b1e6502282836bdde02c3e8ed34f9bd450a2096109fd5f6c6340d7438" exitCode=0 Dec 16 07:08:21 crc kubenswrapper[4789]: I1216 07:08:21.816306 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" event={"ID":"e147cdc2-fabb-407d-992a-d4a654c09fa2","Type":"ContainerDied","Data":"3aebd91b1e6502282836bdde02c3e8ed34f9bd450a2096109fd5f6c6340d7438"} Dec 16 07:08:21 crc kubenswrapper[4789]: I1216 07:08:21.927803 4789 patch_prober.go:28] 
interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:08:21 crc kubenswrapper[4789]: I1216 07:08:21.927885 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:08:22 crc kubenswrapper[4789]: I1216 07:08:22.827241 4789 generic.go:334] "Generic (PLEG): container finished" podID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerID="5ca3c5e5ddb7556a4a516525d560ee5889233d202e89672b9db08c11fb260fda" exitCode=0 Dec 16 07:08:22 crc kubenswrapper[4789]: I1216 07:08:22.827301 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" event={"ID":"e147cdc2-fabb-407d-992a-d4a654c09fa2","Type":"ContainerDied","Data":"5ca3c5e5ddb7556a4a516525d560ee5889233d202e89672b9db08c11fb260fda"} Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.062057 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.139991 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvqvz\" (UniqueName: \"kubernetes.io/projected/e147cdc2-fabb-407d-992a-d4a654c09fa2-kube-api-access-cvqvz\") pod \"e147cdc2-fabb-407d-992a-d4a654c09fa2\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.140386 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-bundle\") pod \"e147cdc2-fabb-407d-992a-d4a654c09fa2\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.140454 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-util\") pod \"e147cdc2-fabb-407d-992a-d4a654c09fa2\" (UID: \"e147cdc2-fabb-407d-992a-d4a654c09fa2\") " Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.141628 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-bundle" (OuterVolumeSpecName: "bundle") pod "e147cdc2-fabb-407d-992a-d4a654c09fa2" (UID: "e147cdc2-fabb-407d-992a-d4a654c09fa2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.146229 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e147cdc2-fabb-407d-992a-d4a654c09fa2-kube-api-access-cvqvz" (OuterVolumeSpecName: "kube-api-access-cvqvz") pod "e147cdc2-fabb-407d-992a-d4a654c09fa2" (UID: "e147cdc2-fabb-407d-992a-d4a654c09fa2"). InnerVolumeSpecName "kube-api-access-cvqvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.156114 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-util" (OuterVolumeSpecName: "util") pod "e147cdc2-fabb-407d-992a-d4a654c09fa2" (UID: "e147cdc2-fabb-407d-992a-d4a654c09fa2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.241851 4789 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.241901 4789 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e147cdc2-fabb-407d-992a-d4a654c09fa2-util\") on node \"crc\" DevicePath \"\"" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.241927 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvqvz\" (UniqueName: \"kubernetes.io/projected/e147cdc2-fabb-407d-992a-d4a654c09fa2-kube-api-access-cvqvz\") on node \"crc\" DevicePath \"\"" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.843793 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" event={"ID":"e147cdc2-fabb-407d-992a-d4a654c09fa2","Type":"ContainerDied","Data":"5751d8afbf7458223e5aa21180ce6ef0134547e52a65551f584da1d61996ed38"} Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.843843 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5751d8afbf7458223e5aa21180ce6ef0134547e52a65551f584da1d61996ed38" Dec 16 07:08:24 crc kubenswrapper[4789]: I1216 07:08:24.843864 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.043856 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf"] Dec 16 07:08:29 crc kubenswrapper[4789]: E1216 07:08:29.044507 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerName="extract" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.044524 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerName="extract" Dec 16 07:08:29 crc kubenswrapper[4789]: E1216 07:08:29.044540 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerName="pull" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.044548 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerName="pull" Dec 16 07:08:29 crc kubenswrapper[4789]: E1216 07:08:29.044573 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerName="util" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.044583 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerName="util" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.044728 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e147cdc2-fabb-407d-992a-d4a654c09fa2" containerName="extract" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.045270 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.047846 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-s5nh4" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.078308 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf"] Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.105698 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5rvw\" (UniqueName: \"kubernetes.io/projected/c1f222bf-e01e-4bd6-a12a-15b726f8bb85-kube-api-access-r5rvw\") pod \"openstack-operator-controller-operator-69fc74c8bb-hghpf\" (UID: \"c1f222bf-e01e-4bd6-a12a-15b726f8bb85\") " pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.207196 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5rvw\" (UniqueName: \"kubernetes.io/projected/c1f222bf-e01e-4bd6-a12a-15b726f8bb85-kube-api-access-r5rvw\") pod \"openstack-operator-controller-operator-69fc74c8bb-hghpf\" (UID: \"c1f222bf-e01e-4bd6-a12a-15b726f8bb85\") " pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.226998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5rvw\" (UniqueName: \"kubernetes.io/projected/c1f222bf-e01e-4bd6-a12a-15b726f8bb85-kube-api-access-r5rvw\") pod \"openstack-operator-controller-operator-69fc74c8bb-hghpf\" (UID: \"c1f222bf-e01e-4bd6-a12a-15b726f8bb85\") " pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.363554 4789 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.804832 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf"] Dec 16 07:08:29 crc kubenswrapper[4789]: I1216 07:08:29.870785 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" event={"ID":"c1f222bf-e01e-4bd6-a12a-15b726f8bb85","Type":"ContainerStarted","Data":"9ebe018f90632e1d760be1a46b70b228a3d7c7c5d521b2dff81a4a8bb52d5462"} Dec 16 07:08:34 crc kubenswrapper[4789]: I1216 07:08:34.906246 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" event={"ID":"c1f222bf-e01e-4bd6-a12a-15b726f8bb85","Type":"ContainerStarted","Data":"98521bcbb97260b9a3eb3ce3d026fd1945336de165395644be148cd407a624a5"} Dec 16 07:08:34 crc kubenswrapper[4789]: I1216 07:08:34.906746 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" Dec 16 07:08:34 crc kubenswrapper[4789]: I1216 07:08:34.935887 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" podStartSLOduration=1.668014678 podStartE2EDuration="5.935864795s" podCreationTimestamp="2025-12-16 07:08:29 +0000 UTC" firstStartedPulling="2025-12-16 07:08:29.812214885 +0000 UTC m=+1048.074102504" lastFinishedPulling="2025-12-16 07:08:34.080064992 +0000 UTC m=+1052.341952621" observedRunningTime="2025-12-16 07:08:34.931772855 +0000 UTC m=+1053.193660494" watchObservedRunningTime="2025-12-16 07:08:34.935864795 +0000 UTC m=+1053.197752434" Dec 16 07:08:39 crc kubenswrapper[4789]: I1216 07:08:39.366540 
4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-69fc74c8bb-hghpf" Dec 16 07:08:51 crc kubenswrapper[4789]: I1216 07:08:51.928228 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:08:51 crc kubenswrapper[4789]: I1216 07:08:51.929783 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:08:51 crc kubenswrapper[4789]: I1216 07:08:51.929965 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:08:51 crc kubenswrapper[4789]: I1216 07:08:51.930622 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5498247db061c67566479b4544d243bb1272801b3a301b0847cb7fdd1e323de"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:08:51 crc kubenswrapper[4789]: I1216 07:08:51.930759 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://b5498247db061c67566479b4544d243bb1272801b3a301b0847cb7fdd1e323de" gracePeriod=600 Dec 16 07:08:54 crc kubenswrapper[4789]: I1216 07:08:54.030353 
4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="b5498247db061c67566479b4544d243bb1272801b3a301b0847cb7fdd1e323de" exitCode=0 Dec 16 07:08:54 crc kubenswrapper[4789]: I1216 07:08:54.031083 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"b5498247db061c67566479b4544d243bb1272801b3a301b0847cb7fdd1e323de"} Dec 16 07:08:54 crc kubenswrapper[4789]: I1216 07:08:54.031136 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"a9c7a67d0b05df89259805e04a44c28da359f8954db5a37cfc842fbdb4aa2e7a"} Dec 16 07:08:54 crc kubenswrapper[4789]: I1216 07:08:54.031158 4789 scope.go:117] "RemoveContainer" containerID="3851c1899da6a57a194edd039ca6372a9930890332c280cff4f36f157e8d3272" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.426746 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-hlm8z"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.428235 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.430088 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.430513 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-82j8b" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.430724 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.435413 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-6z5d7" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.438812 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.439853 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.444064 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pp69v" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.452438 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-hlm8z"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.460554 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.469175 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.490574 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.491529 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.499360 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.500112 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.502142 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-fhpvj" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.508048 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8bzvm" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.511649 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrht\" (UniqueName: \"kubernetes.io/projected/f4d189a6-9923-41a3-be17-a18a76b9d382-kube-api-access-8mrht\") pod \"glance-operator-controller-manager-767f9d7567-bm8v5\" (UID: \"f4d189a6-9923-41a3-be17-a18a76b9d382\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.511719 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfhm4\" (UniqueName: \"kubernetes.io/projected/cc2943a0-fd8f-49bd-bf85-aa6fb274e999-kube-api-access-qfhm4\") pod \"cinder-operator-controller-manager-5f98b4754f-g2w7k\" (UID: \"cc2943a0-fd8f-49bd-bf85-aa6fb274e999\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.511742 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkkp\" (UniqueName: \"kubernetes.io/projected/0152085f-c1f6-478c-9044-749eb51fad39-kube-api-access-7mkkp\") pod \"designate-operator-controller-manager-66f8b87655-gkhql\" (UID: \"0152085f-c1f6-478c-9044-749eb51fad39\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.511783 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lclbb\" (UniqueName: \"kubernetes.io/projected/dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f-kube-api-access-lclbb\") pod \"heat-operator-controller-manager-59b8dcb766-2t6n8\" (UID: \"dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.511811 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnv4d\" (UniqueName: \"kubernetes.io/projected/6132cbf3-8a9f-4505-adcc-2e46beb5bf0e-kube-api-access-pnv4d\") pod \"barbican-operator-controller-manager-95949466-hlm8z\" (UID: \"6132cbf3-8a9f-4505-adcc-2e46beb5bf0e\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.518077 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.525648 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.538888 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.539901 4789 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.552859 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7vs84" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.565760 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-j97qq"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.566640 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.566746 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.571013 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-j97qq"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.574704 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.578449 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-59wlp" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.580115 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.581054 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.584575 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7h7dw" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.602136 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.609808 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.610520 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.611847 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jk594" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613164 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfhm4\" (UniqueName: \"kubernetes.io/projected/cc2943a0-fd8f-49bd-bf85-aa6fb274e999-kube-api-access-qfhm4\") pod \"cinder-operator-controller-manager-5f98b4754f-g2w7k\" (UID: \"cc2943a0-fd8f-49bd-bf85-aa6fb274e999\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613193 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkkp\" (UniqueName: \"kubernetes.io/projected/0152085f-c1f6-478c-9044-749eb51fad39-kube-api-access-7mkkp\") pod \"designate-operator-controller-manager-66f8b87655-gkhql\" (UID: \"0152085f-c1f6-478c-9044-749eb51fad39\") " 
pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613233 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25kg4\" (UniqueName: \"kubernetes.io/projected/10da721c-ec68-4b14-b65e-ebf283e4ba59-kube-api-access-25kg4\") pod \"ironic-operator-controller-manager-f458558d7-f46p9\" (UID: \"10da721c-ec68-4b14-b65e-ebf283e4ba59\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613253 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613272 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclbb\" (UniqueName: \"kubernetes.io/projected/dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f-kube-api-access-lclbb\") pod \"heat-operator-controller-manager-59b8dcb766-2t6n8\" (UID: \"dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613290 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24f6\" (UniqueName: \"kubernetes.io/projected/d30a0974-7667-4999-9c46-3970ad1a6a8b-kube-api-access-m24f6\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613309 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrjwp\" (UniqueName: \"kubernetes.io/projected/226147eb-5ae9-43a3-8d68-19115b510a2f-kube-api-access-vrjwp\") pod \"horizon-operator-controller-manager-6ccf486b9-8wxm6\" (UID: \"226147eb-5ae9-43a3-8d68-19115b510a2f\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613332 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnv4d\" (UniqueName: \"kubernetes.io/projected/6132cbf3-8a9f-4505-adcc-2e46beb5bf0e-kube-api-access-pnv4d\") pod \"barbican-operator-controller-manager-95949466-hlm8z\" (UID: \"6132cbf3-8a9f-4505-adcc-2e46beb5bf0e\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.613354 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mrht\" (UniqueName: \"kubernetes.io/projected/f4d189a6-9923-41a3-be17-a18a76b9d382-kube-api-access-8mrht\") pod \"glance-operator-controller-manager-767f9d7567-bm8v5\" (UID: \"f4d189a6-9923-41a3-be17-a18a76b9d382\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.636276 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.637102 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.640543 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7j8lb" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.646531 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfhm4\" (UniqueName: \"kubernetes.io/projected/cc2943a0-fd8f-49bd-bf85-aa6fb274e999-kube-api-access-qfhm4\") pod \"cinder-operator-controller-manager-5f98b4754f-g2w7k\" (UID: \"cc2943a0-fd8f-49bd-bf85-aa6fb274e999\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.649078 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclbb\" (UniqueName: \"kubernetes.io/projected/dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f-kube-api-access-lclbb\") pod \"heat-operator-controller-manager-59b8dcb766-2t6n8\" (UID: \"dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.651430 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.652520 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mkkp\" (UniqueName: \"kubernetes.io/projected/0152085f-c1f6-478c-9044-749eb51fad39-kube-api-access-7mkkp\") pod \"designate-operator-controller-manager-66f8b87655-gkhql\" (UID: \"0152085f-c1f6-478c-9044-749eb51fad39\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.652681 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8mrht\" (UniqueName: \"kubernetes.io/projected/f4d189a6-9923-41a3-be17-a18a76b9d382-kube-api-access-8mrht\") pod \"glance-operator-controller-manager-767f9d7567-bm8v5\" (UID: \"f4d189a6-9923-41a3-be17-a18a76b9d382\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.659807 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnv4d\" (UniqueName: \"kubernetes.io/projected/6132cbf3-8a9f-4505-adcc-2e46beb5bf0e-kube-api-access-pnv4d\") pod \"barbican-operator-controller-manager-95949466-hlm8z\" (UID: \"6132cbf3-8a9f-4505-adcc-2e46beb5bf0e\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.675593 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.676618 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.686847 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gt4df" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.704830 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.705826 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.707607 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-k6mgl" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.711063 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.712090 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.714809 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzwpt\" (UniqueName: \"kubernetes.io/projected/4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d-kube-api-access-hzwpt\") pod \"mariadb-operator-controller-manager-f76f4954c-jfxhw\" (UID: \"4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.714856 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25kg4\" (UniqueName: \"kubernetes.io/projected/10da721c-ec68-4b14-b65e-ebf283e4ba59-kube-api-access-25kg4\") pod \"ironic-operator-controller-manager-f458558d7-f46p9\" (UID: \"10da721c-ec68-4b14-b65e-ebf283e4ba59\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.714878 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " 
pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.714901 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24f6\" (UniqueName: \"kubernetes.io/projected/d30a0974-7667-4999-9c46-3970ad1a6a8b-kube-api-access-m24f6\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.714940 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrjwp\" (UniqueName: \"kubernetes.io/projected/226147eb-5ae9-43a3-8d68-19115b510a2f-kube-api-access-vrjwp\") pod \"horizon-operator-controller-manager-6ccf486b9-8wxm6\" (UID: \"226147eb-5ae9-43a3-8d68-19115b510a2f\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.714973 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7687\" (UniqueName: \"kubernetes.io/projected/db9badf9-8fa3-484a-8ca4-ffa31c0c29c5-kube-api-access-x7687\") pod \"manila-operator-controller-manager-5fdd9786f7-p29cz\" (UID: \"db9badf9-8fa3-484a-8ca4-ffa31c0c29c5\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.714995 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c768b\" (UniqueName: \"kubernetes.io/projected/822ac1df-18a3-4440-bd77-507c589ff693-kube-api-access-c768b\") pod \"neutron-operator-controller-manager-7cd87b778f-72lbq\" (UID: \"822ac1df-18a3-4440-bd77-507c589ff693\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 
07:09:03.716176 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmpsx\" (UniqueName: \"kubernetes.io/projected/5d5d9279-e35b-4b95-be8e-dc54a056e7b5-kube-api-access-wmpsx\") pod \"keystone-operator-controller-manager-5c7cbf548f-pxvzs\" (UID: \"5d5d9279-e35b-4b95-be8e-dc54a056e7b5\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" Dec 16 07:09:03 crc kubenswrapper[4789]: E1216 07:09:03.716933 4789 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:03 crc kubenswrapper[4789]: E1216 07:09:03.717002 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert podName:d30a0974-7667-4999-9c46-3970ad1a6a8b nodeName:}" failed. No retries permitted until 2025-12-16 07:09:04.216968746 +0000 UTC m=+1082.478856375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert") pod "infra-operator-controller-manager-84b495f78-j97qq" (UID: "d30a0974-7667-4999-9c46-3970ad1a6a8b") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.718349 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kzxnt" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.755612 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.760664 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24f6\" (UniqueName: \"kubernetes.io/projected/d30a0974-7667-4999-9c46-3970ad1a6a8b-kube-api-access-m24f6\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.770746 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.772134 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrjwp\" (UniqueName: \"kubernetes.io/projected/226147eb-5ae9-43a3-8d68-19115b510a2f-kube-api-access-vrjwp\") pod \"horizon-operator-controller-manager-6ccf486b9-8wxm6\" (UID: \"226147eb-5ae9-43a3-8d68-19115b510a2f\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.779410 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25kg4\" (UniqueName: \"kubernetes.io/projected/10da721c-ec68-4b14-b65e-ebf283e4ba59-kube-api-access-25kg4\") pod \"ironic-operator-controller-manager-f458558d7-f46p9\" (UID: \"10da721c-ec68-4b14-b65e-ebf283e4ba59\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.782036 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.788075 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.794931 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.811656 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.819164 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7687\" (UniqueName: \"kubernetes.io/projected/db9badf9-8fa3-484a-8ca4-ffa31c0c29c5-kube-api-access-x7687\") pod \"manila-operator-controller-manager-5fdd9786f7-p29cz\" (UID: \"db9badf9-8fa3-484a-8ca4-ffa31c0c29c5\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.819219 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c768b\" (UniqueName: \"kubernetes.io/projected/822ac1df-18a3-4440-bd77-507c589ff693-kube-api-access-c768b\") pod \"neutron-operator-controller-manager-7cd87b778f-72lbq\" (UID: \"822ac1df-18a3-4440-bd77-507c589ff693\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.819286 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmpsx\" (UniqueName: \"kubernetes.io/projected/5d5d9279-e35b-4b95-be8e-dc54a056e7b5-kube-api-access-wmpsx\") pod \"keystone-operator-controller-manager-5c7cbf548f-pxvzs\" (UID: \"5d5d9279-e35b-4b95-be8e-dc54a056e7b5\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.819334 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hzwpt\" (UniqueName: \"kubernetes.io/projected/4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d-kube-api-access-hzwpt\") pod \"mariadb-operator-controller-manager-f76f4954c-jfxhw\" (UID: \"4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.819410 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gsn2\" (UniqueName: \"kubernetes.io/projected/c7268550-e5d4-4664-b04d-ecfa498cb475-kube-api-access-9gsn2\") pod \"nova-operator-controller-manager-5fbbf8b6cc-d6bcb\" (UID: \"c7268550-e5d4-4664-b04d-ecfa498cb475\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.827128 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.830972 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.838230 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzwpt\" (UniqueName: \"kubernetes.io/projected/4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d-kube-api-access-hzwpt\") pod \"mariadb-operator-controller-manager-f76f4954c-jfxhw\" (UID: \"4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.838951 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c768b\" (UniqueName: \"kubernetes.io/projected/822ac1df-18a3-4440-bd77-507c589ff693-kube-api-access-c768b\") pod 
\"neutron-operator-controller-manager-7cd87b778f-72lbq\" (UID: \"822ac1df-18a3-4440-bd77-507c589ff693\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.842339 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7687\" (UniqueName: \"kubernetes.io/projected/db9badf9-8fa3-484a-8ca4-ffa31c0c29c5-kube-api-access-x7687\") pod \"manila-operator-controller-manager-5fdd9786f7-p29cz\" (UID: \"db9badf9-8fa3-484a-8ca4-ffa31c0c29c5\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.844505 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.846674 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmpsx\" (UniqueName: \"kubernetes.io/projected/5d5d9279-e35b-4b95-be8e-dc54a056e7b5-kube-api-access-wmpsx\") pod \"keystone-operator-controller-manager-5c7cbf548f-pxvzs\" (UID: \"5d5d9279-e35b-4b95-be8e-dc54a056e7b5\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.852860 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.871011 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.882306 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.882543 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.886669 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.887996 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.891551 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-bc999" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.891787 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6ctbt" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.903978 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.908972 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.915148 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.915893 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.917496 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.917659 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ptrtc" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.920194 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gsn2\" (UniqueName: \"kubernetes.io/projected/c7268550-e5d4-4664-b04d-ecfa498cb475-kube-api-access-9gsn2\") pod \"nova-operator-controller-manager-5fbbf8b6cc-d6bcb\" (UID: \"c7268550-e5d4-4664-b04d-ecfa498cb475\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.920235 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lzjx\" (UniqueName: \"kubernetes.io/projected/826f108e-bfd8-43bb-8719-d9a569778578-kube-api-access-5lzjx\") pod \"ovn-operator-controller-manager-bf6d4f946-6ztbz\" (UID: 
\"826f108e-bfd8-43bb-8719-d9a569778578\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.920270 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgr7\" (UniqueName: \"kubernetes.io/projected/d03193ca-0584-4d77-bab4-5e42abf5b5b5-kube-api-access-tdgr7\") pod \"octavia-operator-controller-manager-68c649d9d-d99lr\" (UID: \"d03193ca-0584-4d77-bab4-5e42abf5b5b5\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.924876 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.926660 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.928178 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rhm84" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.938192 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.962460 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.963660 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.968346 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-prb2t" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.968773 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gsn2\" (UniqueName: \"kubernetes.io/projected/c7268550-e5d4-4664-b04d-ecfa498cb475-kube-api-access-9gsn2\") pod \"nova-operator-controller-manager-5fbbf8b6cc-d6bcb\" (UID: \"c7268550-e5d4-4664-b04d-ecfa498cb475\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.973303 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.981024 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.986322 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr"] Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.987081 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" Dec 16 07:09:03 crc kubenswrapper[4789]: I1216 07:09:03.990412 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xkd6j" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.018047 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.021203 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtqx4\" (UniqueName: \"kubernetes.io/projected/1950613d-02b6-4c9f-925a-e3ece57069ed-kube-api-access-xtqx4\") pod \"swift-operator-controller-manager-5c6df8f9-8dfvw\" (UID: \"1950613d-02b6-4c9f-925a-e3ece57069ed\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.021233 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lzjx\" (UniqueName: \"kubernetes.io/projected/826f108e-bfd8-43bb-8719-d9a569778578-kube-api-access-5lzjx\") pod \"ovn-operator-controller-manager-bf6d4f946-6ztbz\" (UID: \"826f108e-bfd8-43bb-8719-d9a569778578\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.021259 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgr7\" (UniqueName: \"kubernetes.io/projected/d03193ca-0584-4d77-bab4-5e42abf5b5b5-kube-api-access-tdgr7\") pod \"octavia-operator-controller-manager-68c649d9d-d99lr\" (UID: \"d03193ca-0584-4d77-bab4-5e42abf5b5b5\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.021282 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsnlm\" (UniqueName: \"kubernetes.io/projected/a9aa6ddb-befe-472b-bbaf-c17285d7ade4-kube-api-access-lsnlm\") pod \"telemetry-operator-controller-manager-97d456b9-f6ndr\" (UID: \"a9aa6ddb-befe-472b-bbaf-c17285d7ade4\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" Dec 16 07:09:04 crc 
kubenswrapper[4789]: I1216 07:09:04.021327 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdmm\" (UniqueName: \"kubernetes.io/projected/5db5b7f8-cc13-42b5-9c72-87bf990091d2-kube-api-access-sfdmm\") pod \"placement-operator-controller-manager-8665b56d78-b6v9v\" (UID: \"5db5b7f8-cc13-42b5-9c72-87bf990091d2\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.021344 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286kn\" (UniqueName: \"kubernetes.io/projected/a08c1d95-200f-40ce-abef-dbb505570602-kube-api-access-286kn\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.021361 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.024948 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.038240 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.039065 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.044241 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jvhpp" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.046564 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lzjx\" (UniqueName: \"kubernetes.io/projected/826f108e-bfd8-43bb-8719-d9a569778578-kube-api-access-5lzjx\") pod \"ovn-operator-controller-manager-bf6d4f946-6ztbz\" (UID: \"826f108e-bfd8-43bb-8719-d9a569778578\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.046800 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.048963 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgr7\" (UniqueName: \"kubernetes.io/projected/d03193ca-0584-4d77-bab4-5e42abf5b5b5-kube-api-access-tdgr7\") pod \"octavia-operator-controller-manager-68c649d9d-d99lr\" (UID: \"d03193ca-0584-4d77-bab4-5e42abf5b5b5\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.052425 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.103259 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.104137 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.108873 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wwk2r" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.120171 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.122305 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfdmm\" (UniqueName: \"kubernetes.io/projected/5db5b7f8-cc13-42b5-9c72-87bf990091d2-kube-api-access-sfdmm\") pod \"placement-operator-controller-manager-8665b56d78-b6v9v\" (UID: \"5db5b7f8-cc13-42b5-9c72-87bf990091d2\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.122339 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-286kn\" (UniqueName: \"kubernetes.io/projected/a08c1d95-200f-40ce-abef-dbb505570602-kube-api-access-286kn\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.122357 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.122422 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdst\" (UniqueName: \"kubernetes.io/projected/07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0-kube-api-access-7vdst\") pod \"watcher-operator-controller-manager-55f78b7c4c-4xsfm\" (UID: \"07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.122444 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtqx4\" (UniqueName: \"kubernetes.io/projected/1950613d-02b6-4c9f-925a-e3ece57069ed-kube-api-access-xtqx4\") pod \"swift-operator-controller-manager-5c6df8f9-8dfvw\" (UID: \"1950613d-02b6-4c9f-925a-e3ece57069ed\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.122481 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46z5d\" (UniqueName: \"kubernetes.io/projected/8fce40f9-3595-4e54-816f-9e567e87ef4b-kube-api-access-46z5d\") pod \"test-operator-controller-manager-756ccf86c7-8vdpj\" (UID: \"8fce40f9-3595-4e54-816f-9e567e87ef4b\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.122509 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsnlm\" (UniqueName: \"kubernetes.io/projected/a9aa6ddb-befe-472b-bbaf-c17285d7ade4-kube-api-access-lsnlm\") pod \"telemetry-operator-controller-manager-97d456b9-f6ndr\" (UID: \"a9aa6ddb-befe-472b-bbaf-c17285d7ade4\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.123500 4789 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.123543 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert podName:a08c1d95-200f-40ce-abef-dbb505570602 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:04.623530341 +0000 UTC m=+1082.885417970 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6br444w" (UID: "a08c1d95-200f-40ce-abef-dbb505570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.134625 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.156376 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.156960 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtqx4\" (UniqueName: \"kubernetes.io/projected/1950613d-02b6-4c9f-925a-e3ece57069ed-kube-api-access-xtqx4\") pod \"swift-operator-controller-manager-5c6df8f9-8dfvw\" (UID: \"1950613d-02b6-4c9f-925a-e3ece57069ed\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.165079 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.166611 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.167456 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.170797 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.171099 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6psqg" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.176773 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.209539 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.214964 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsnlm\" (UniqueName: \"kubernetes.io/projected/a9aa6ddb-befe-472b-bbaf-c17285d7ade4-kube-api-access-lsnlm\") pod \"telemetry-operator-controller-manager-97d456b9-f6ndr\" (UID: \"a9aa6ddb-befe-472b-bbaf-c17285d7ade4\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.215539 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfdmm\" (UniqueName: \"kubernetes.io/projected/5db5b7f8-cc13-42b5-9c72-87bf990091d2-kube-api-access-sfdmm\") pod \"placement-operator-controller-manager-8665b56d78-b6v9v\" (UID: \"5db5b7f8-cc13-42b5-9c72-87bf990091d2\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.216510 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-286kn\" (UniqueName: 
\"kubernetes.io/projected/a08c1d95-200f-40ce-abef-dbb505570602-kube-api-access-286kn\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.223383 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.223533 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.223705 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdst\" (UniqueName: \"kubernetes.io/projected/07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0-kube-api-access-7vdst\") pod \"watcher-operator-controller-manager-55f78b7c4c-4xsfm\" (UID: \"07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.223877 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46z5d\" (UniqueName: \"kubernetes.io/projected/8fce40f9-3595-4e54-816f-9e567e87ef4b-kube-api-access-46z5d\") pod \"test-operator-controller-manager-756ccf86c7-8vdpj\" (UID: 
\"8fce40f9-3595-4e54-816f-9e567e87ef4b\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.223977 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lktbk\" (UniqueName: \"kubernetes.io/projected/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-kube-api-access-lktbk\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.224044 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.224373 4789 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.224424 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert podName:d30a0974-7667-4999-9c46-3970ad1a6a8b nodeName:}" failed. No retries permitted until 2025-12-16 07:09:05.224405947 +0000 UTC m=+1083.486293576 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert") pod "infra-operator-controller-manager-84b495f78-j97qq" (UID: "d30a0974-7667-4999-9c46-3970ad1a6a8b") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.225162 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.244466 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46z5d\" (UniqueName: \"kubernetes.io/projected/8fce40f9-3595-4e54-816f-9e567e87ef4b-kube-api-access-46z5d\") pod \"test-operator-controller-manager-756ccf86c7-8vdpj\" (UID: \"8fce40f9-3595-4e54-816f-9e567e87ef4b\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.251787 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdst\" (UniqueName: \"kubernetes.io/projected/07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0-kube-api-access-7vdst\") pod \"watcher-operator-controller-manager-55f78b7c4c-4xsfm\" (UID: \"07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.253254 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.258482 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.262322 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.264871 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vmk4l" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.268663 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.273363 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.321619 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.329113 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lktbk\" (UniqueName: \"kubernetes.io/projected/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-kube-api-access-lktbk\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.329183 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.329280 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.329659 4789 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.329743 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:04.829726431 +0000 UTC m=+1083.091614060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.330777 4789 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.330820 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:04.830810278 +0000 UTC m=+1083.092697907 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "metrics-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.330880 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fzn\" (UniqueName: \"kubernetes.io/projected/26befb39-90f5-4fa1-8f8a-3b82ebae6472-kube-api-access-w4fzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k7fwx\" (UID: \"26befb39-90f5-4fa1-8f8a-3b82ebae6472\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.340723 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.371431 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.374483 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-hlm8z"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.375924 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lktbk\" (UniqueName: \"kubernetes.io/projected/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-kube-api-access-lktbk\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.385935 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k"] Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.434893 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fzn\" (UniqueName: \"kubernetes.io/projected/26befb39-90f5-4fa1-8f8a-3b82ebae6472-kube-api-access-w4fzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k7fwx\" (UID: \"26befb39-90f5-4fa1-8f8a-3b82ebae6472\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.458380 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.462311 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fzn\" (UniqueName: \"kubernetes.io/projected/26befb39-90f5-4fa1-8f8a-3b82ebae6472-kube-api-access-w4fzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k7fwx\" (UID: \"26befb39-90f5-4fa1-8f8a-3b82ebae6472\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" Dec 16 07:09:04 crc kubenswrapper[4789]: W1216 07:09:04.469361 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6132cbf3_8a9f_4505_adcc_2e46beb5bf0e.slice/crio-f44fae09f8c02e8a179cdd655b2b5364bca63250c5c46679a0506768b15483f1 WatchSource:0}: Error finding container f44fae09f8c02e8a179cdd655b2b5364bca63250c5c46679a0506768b15483f1: Status 404 returned error can't find the container with id f44fae09f8c02e8a179cdd655b2b5364bca63250c5c46679a0506768b15483f1 Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.636954 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.637187 4789 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.637239 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert 
podName:a08c1d95-200f-40ce-abef-dbb505570602 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:05.637225285 +0000 UTC m=+1083.899112914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6br444w" (UID: "a08c1d95-200f-40ce-abef-dbb505570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.718191 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.841972 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: I1216 07:09:04.842075 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.842077 4789 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.842146 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs 
podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:05.842130303 +0000 UTC m=+1084.104017932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "webhook-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.842222 4789 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:09:04 crc kubenswrapper[4789]: E1216 07:09:04.842273 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:05.842257936 +0000 UTC m=+1084.104145615 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "metrics-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.090214 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql"] Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.102733 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0152085f_c1f6_478c_9044_749eb51fad39.slice/crio-93d873ce3e46babb655c6bfc8f33e544234aced90d30287484c337f911973350 WatchSource:0}: Error finding container 93d873ce3e46babb655c6bfc8f33e544234aced90d30287484c337f911973350: Status 404 returned error can't find the container with id 93d873ce3e46babb655c6bfc8f33e544234aced90d30287484c337f911973350 Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.191193 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" event={"ID":"cc2943a0-fd8f-49bd-bf85-aa6fb274e999","Type":"ContainerStarted","Data":"78ee4f6bfd424a28a34ef08c49888adeb35b6a6b52d0c07bcfe32320a9e17e0d"} Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.198473 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" event={"ID":"6132cbf3-8a9f-4505-adcc-2e46beb5bf0e","Type":"ContainerStarted","Data":"f44fae09f8c02e8a179cdd655b2b5364bca63250c5c46679a0506768b15483f1"} Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.203638 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" 
event={"ID":"0152085f-c1f6-478c-9044-749eb51fad39","Type":"ContainerStarted","Data":"93d873ce3e46babb655c6bfc8f33e544234aced90d30287484c337f911973350"} Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.249699 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.249888 4789 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.249977 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert podName:d30a0974-7667-4999-9c46-3970ad1a6a8b nodeName:}" failed. No retries permitted until 2025-12-16 07:09:07.249960549 +0000 UTC m=+1085.511848178 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert") pod "infra-operator-controller-manager-84b495f78-j97qq" (UID: "d30a0974-7667-4999-9c46-3970ad1a6a8b") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.300777 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.316070 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.331028 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8"] Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.338637 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822ac1df_18a3_4440_bd77_507c589ff693.slice/crio-29969afc34fc199abef706a24a4c9414891a895fda34f146c4b16b4aeb2357f2 WatchSource:0}: Error finding container 29969afc34fc199abef706a24a4c9414891a895fda34f146c4b16b4aeb2357f2: Status 404 returned error can't find the container with id 29969afc34fc199abef706a24a4c9414891a895fda34f146c4b16b4aeb2357f2 Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.344547 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d189a6_9923_41a3_be17_a18a76b9d382.slice/crio-9d3902e00e31d38ddc16502a387e87bd1ce52bddddc0bbf02133e79c8ef91122 WatchSource:0}: Error finding container 9d3902e00e31d38ddc16502a387e87bd1ce52bddddc0bbf02133e79c8ef91122: Status 404 returned error can't find the container with id 9d3902e00e31d38ddc16502a387e87bd1ce52bddddc0bbf02133e79c8ef91122 Dec 16 07:09:05 crc kubenswrapper[4789]: 
I1216 07:09:05.360222 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.643890 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.660398 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.660596 4789 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.660669 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert podName:a08c1d95-200f-40ce-abef-dbb505570602 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:07.660652526 +0000 UTC m=+1085.922540155 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6br444w" (UID: "a08c1d95-200f-40ce-abef-dbb505570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.670424 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz"] Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.672092 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod826f108e_bfd8_43bb_8719_d9a569778578.slice/crio-f10f6bad1bcd1a7c55f637c2ceb19aa5fbe1bbe701253f507bd16edb642b7bcf WatchSource:0}: Error finding container f10f6bad1bcd1a7c55f637c2ceb19aa5fbe1bbe701253f507bd16edb642b7bcf: Status 404 returned error can't find the container with id f10f6bad1bcd1a7c55f637c2ceb19aa5fbe1bbe701253f507bd16edb642b7bcf Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.673334 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d5d9279_e35b_4b95_be8e_dc54a056e7b5.slice/crio-ee1f1ed378bd5a29b89a7c0f32de66bc0e3df57296693bd541f4fdf8a3414ed8 WatchSource:0}: Error finding container ee1f1ed378bd5a29b89a7c0f32de66bc0e3df57296693bd541f4fdf8a3414ed8: Status 404 returned error can't find the container with id ee1f1ed378bd5a29b89a7c0f32de66bc0e3df57296693bd541f4fdf8a3414ed8 Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.678998 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd03193ca_0584_4d77_bab4_5e42abf5b5b5.slice/crio-708564a4d5fb565f309c1c63036610b59cd9096c3a3dfccbc37f1938502b754b WatchSource:0}: Error finding container 
708564a4d5fb565f309c1c63036610b59cd9096c3a3dfccbc37f1938502b754b: Status 404 returned error can't find the container with id 708564a4d5fb565f309c1c63036610b59cd9096c3a3dfccbc37f1938502b754b Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.679104 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.684695 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.698953 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.704613 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx"] Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.710153 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9aa6ddb_befe_472b_bbaf_c17285d7ade4.slice/crio-dfa2f4345f2f70ed3d0cbaae92772c11cebfcb9d007e7f8b0759c21e940d11fc WatchSource:0}: Error finding container dfa2f4345f2f70ed3d0cbaae92772c11cebfcb9d007e7f8b0759c21e940d11fc: Status 404 returned error can't find the container with id dfa2f4345f2f70ed3d0cbaae92772c11cebfcb9d007e7f8b0759c21e940d11fc Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.711799 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.717867 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9"] Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.718421 4789 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10da721c_ec68_4b14_b65e_ebf283e4ba59.slice/crio-2bc64c339f2517355f52ea5fe972c64df999dc135550b03ffa7a6dcb693e1926 WatchSource:0}: Error finding container 2bc64c339f2517355f52ea5fe972c64df999dc135550b03ffa7a6dcb693e1926: Status 404 returned error can't find the container with id 2bc64c339f2517355f52ea5fe972c64df999dc135550b03ffa7a6dcb693e1926 Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.722496 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-25kg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-f458558d7-f46p9_openstack-operators(10da721c-ec68-4b14-b65e-ebf283e4ba59): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.726180 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" podUID="10da721c-ec68-4b14-b65e-ebf283e4ba59" Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.726244 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw"] Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.727843 4789 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e02dba2_7cf2_4cbd_a2f2_b91ddbec517d.slice/crio-8cf3359806413456dcc15d2662b1648231effbe46e8bf7cde8eeef29d3f1ce99 WatchSource:0}: Error finding container 8cf3359806413456dcc15d2662b1648231effbe46e8bf7cde8eeef29d3f1ce99: Status 404 returned error can't find the container with id 8cf3359806413456dcc15d2662b1648231effbe46e8bf7cde8eeef29d3f1ce99 Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.732113 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj"] Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.736253 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26befb39_90f5_4fa1_8f8a_3b82ebae6472.slice/crio-e5f6817de104d01180de470609000c79f0aeaa51ee2aaef10108ee8130b0ad75 WatchSource:0}: Error finding container e5f6817de104d01180de470609000c79f0aeaa51ee2aaef10108ee8130b0ad75: Status 404 returned error can't find the container with id e5f6817de104d01180de470609000c79f0aeaa51ee2aaef10108ee8130b0ad75 Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.737504 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw"] Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.739718 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hzwpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-f76f4954c-jfxhw_openstack-operators(4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.740806 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" podUID="4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d" Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.742273 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm"] Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.746930 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz"] Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.758248 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7vdst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-55f78b7c4c-4xsfm_openstack-operators(07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.759383 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" podUID="07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.760228 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x7687,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5fdd9786f7-p29cz_openstack-operators(db9badf9-8fa3-484a-8ca4-ffa31c0c29c5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.761359 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" podUID="db9badf9-8fa3-484a-8ca4-ffa31c0c29c5" Dec 16 07:09:05 crc kubenswrapper[4789]: W1216 07:09:05.761892 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1950613d_02b6_4c9f_925a_e3ece57069ed.slice/crio-91a7b0062ba4271688e98f14bf77c9e8679bfef2bec6f76582e8b74e8b4ab5c7 WatchSource:0}: Error finding container 91a7b0062ba4271688e98f14bf77c9e8679bfef2bec6f76582e8b74e8b4ab5c7: Status 404 returned error can't find the container with id 91a7b0062ba4271688e98f14bf77c9e8679bfef2bec6f76582e8b74e8b4ab5c7 Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.764424 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xtqx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5c6df8f9-8dfvw_openstack-operators(1950613d-02b6-4c9f-925a-e3ece57069ed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.766588 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" podUID="1950613d-02b6-4c9f-925a-e3ece57069ed" Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.863972 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.864189 4789 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 
07:09:05.864363 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:07.864345064 +0000 UTC m=+1086.126232683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "webhook-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: I1216 07:09:05.864486 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.864667 4789 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:09:05 crc kubenswrapper[4789]: E1216 07:09:05.864758 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:07.864736113 +0000 UTC m=+1086.126623812 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "metrics-server-cert" not found Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.211292 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" event={"ID":"c7268550-e5d4-4664-b04d-ecfa498cb475","Type":"ContainerStarted","Data":"ac6195ebd84c467c551bce854d1c2a16f247c0a92e1565fc911ef9c357c20b99"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.212554 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" event={"ID":"822ac1df-18a3-4440-bd77-507c589ff693","Type":"ContainerStarted","Data":"29969afc34fc199abef706a24a4c9414891a895fda34f146c4b16b4aeb2357f2"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.213957 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" event={"ID":"1950613d-02b6-4c9f-925a-e3ece57069ed","Type":"ContainerStarted","Data":"91a7b0062ba4271688e98f14bf77c9e8679bfef2bec6f76582e8b74e8b4ab5c7"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.215059 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" event={"ID":"5d5d9279-e35b-4b95-be8e-dc54a056e7b5","Type":"ContainerStarted","Data":"ee1f1ed378bd5a29b89a7c0f32de66bc0e3df57296693bd541f4fdf8a3414ed8"} Dec 16 07:09:06 crc kubenswrapper[4789]: E1216 07:09:06.215365 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" podUID="1950613d-02b6-4c9f-925a-e3ece57069ed" Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.216613 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" event={"ID":"10da721c-ec68-4b14-b65e-ebf283e4ba59","Type":"ContainerStarted","Data":"2bc64c339f2517355f52ea5fe972c64df999dc135550b03ffa7a6dcb693e1926"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.218601 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" event={"ID":"db9badf9-8fa3-484a-8ca4-ffa31c0c29c5","Type":"ContainerStarted","Data":"5c7e2409e106f1937dbb3ad281cc08d3f56c0c909e2f2caeb2d4aa45d9ca695a"} Dec 16 07:09:06 crc kubenswrapper[4789]: E1216 07:09:06.219608 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" podUID="10da721c-ec68-4b14-b65e-ebf283e4ba59" Dec 16 07:09:06 crc kubenswrapper[4789]: E1216 07:09:06.220115 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" podUID="db9badf9-8fa3-484a-8ca4-ffa31c0c29c5" Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.220424 4789 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" event={"ID":"07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0","Type":"ContainerStarted","Data":"1000e5454bd3ea0854b5bdbefe6fe8090da4292688e210ce78c55e8387d13e6b"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.222751 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" event={"ID":"226147eb-5ae9-43a3-8d68-19115b510a2f","Type":"ContainerStarted","Data":"283576fc1fe5e68a56588997061e60434c12821200d1f5047c6c672b82a8db48"} Dec 16 07:09:06 crc kubenswrapper[4789]: E1216 07:09:06.224503 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" podUID="07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0" Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.236869 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" event={"ID":"d03193ca-0584-4d77-bab4-5e42abf5b5b5","Type":"ContainerStarted","Data":"708564a4d5fb565f309c1c63036610b59cd9096c3a3dfccbc37f1938502b754b"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.239477 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" event={"ID":"4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d","Type":"ContainerStarted","Data":"8cf3359806413456dcc15d2662b1648231effbe46e8bf7cde8eeef29d3f1ce99"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.241960 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" 
event={"ID":"8fce40f9-3595-4e54-816f-9e567e87ef4b","Type":"ContainerStarted","Data":"f63f307a540c791af48a76c94fc1e68ebd3ded90d1069a0e31acce9a8cd6ad96"} Dec 16 07:09:06 crc kubenswrapper[4789]: E1216 07:09:06.244122 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" podUID="4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d" Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.247235 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" event={"ID":"a9aa6ddb-befe-472b-bbaf-c17285d7ade4","Type":"ContainerStarted","Data":"dfa2f4345f2f70ed3d0cbaae92772c11cebfcb9d007e7f8b0759c21e940d11fc"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.249400 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" event={"ID":"826f108e-bfd8-43bb-8719-d9a569778578","Type":"ContainerStarted","Data":"f10f6bad1bcd1a7c55f637c2ceb19aa5fbe1bbe701253f507bd16edb642b7bcf"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.251959 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" event={"ID":"26befb39-90f5-4fa1-8f8a-3b82ebae6472","Type":"ContainerStarted","Data":"e5f6817de104d01180de470609000c79f0aeaa51ee2aaef10108ee8130b0ad75"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.253476 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" 
event={"ID":"f4d189a6-9923-41a3-be17-a18a76b9d382","Type":"ContainerStarted","Data":"9d3902e00e31d38ddc16502a387e87bd1ce52bddddc0bbf02133e79c8ef91122"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.255162 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" event={"ID":"5db5b7f8-cc13-42b5-9c72-87bf990091d2","Type":"ContainerStarted","Data":"ac427c33a29119d67fd1cfc8ee8f77b1f7978432ababae56652889e8aba1c8e8"} Dec 16 07:09:06 crc kubenswrapper[4789]: I1216 07:09:06.270184 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" event={"ID":"dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f","Type":"ContainerStarted","Data":"cea44e8a3f2c724768fdd7120dbdbbe5be9b440b97f73372c2d89de3057a403c"} Dec 16 07:09:07 crc kubenswrapper[4789]: I1216 07:09:07.287014 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.287242 4789 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.287310 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert podName:d30a0974-7667-4999-9c46-3970ad1a6a8b nodeName:}" failed. No retries permitted until 2025-12-16 07:09:11.287290167 +0000 UTC m=+1089.549177796 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert") pod "infra-operator-controller-manager-84b495f78-j97qq" (UID: "d30a0974-7667-4999-9c46-3970ad1a6a8b") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.291787 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" podUID="1950613d-02b6-4c9f-925a-e3ece57069ed" Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.292084 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" podUID="10da721c-ec68-4b14-b65e-ebf283e4ba59" Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.292431 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" podUID="4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d" Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.293024 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" podUID="07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0" Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.294653 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" podUID="db9badf9-8fa3-484a-8ca4-ffa31c0c29c5" Dec 16 07:09:07 crc kubenswrapper[4789]: I1216 07:09:07.693863 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.694167 4789 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.694224 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert podName:a08c1d95-200f-40ce-abef-dbb505570602 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:11.694204342 +0000 UTC m=+1089.956091971 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6br444w" (UID: "a08c1d95-200f-40ce-abef-dbb505570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:07 crc kubenswrapper[4789]: I1216 07:09:07.897675 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:07 crc kubenswrapper[4789]: I1216 07:09:07.897789 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.897811 4789 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.897882 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:11.897866079 +0000 UTC m=+1090.159753708 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "webhook-server-cert" not found Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.897972 4789 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:09:07 crc kubenswrapper[4789]: E1216 07:09:07.898047 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:11.898029142 +0000 UTC m=+1090.159916831 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "metrics-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: I1216 07:09:11.402981 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.403138 4789 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.403489 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert 
podName:d30a0974-7667-4999-9c46-3970ad1a6a8b nodeName:}" failed. No retries permitted until 2025-12-16 07:09:19.403455658 +0000 UTC m=+1097.665343287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert") pod "infra-operator-controller-manager-84b495f78-j97qq" (UID: "d30a0974-7667-4999-9c46-3970ad1a6a8b") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: I1216 07:09:11.707404 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.707519 4789 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.707582 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert podName:a08c1d95-200f-40ce-abef-dbb505570602 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:19.70756533 +0000 UTC m=+1097.969452959 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6br444w" (UID: "a08c1d95-200f-40ce-abef-dbb505570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: I1216 07:09:11.910023 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:11 crc kubenswrapper[4789]: I1216 07:09:11.910143 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.910222 4789 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.910332 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:19.910292944 +0000 UTC m=+1098.172180643 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "webhook-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.910362 4789 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:09:11 crc kubenswrapper[4789]: E1216 07:09:11.910442 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:19.910423497 +0000 UTC m=+1098.172311126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "metrics-server-cert" not found Dec 16 07:09:18 crc kubenswrapper[4789]: E1216 07:09:18.530433 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f" Dec 16 07:09:18 crc kubenswrapper[4789]: E1216 07:09:18.531125 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lsnlm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-97d456b9-f6ndr_openstack-operators(a9aa6ddb-befe-472b-bbaf-c17285d7ade4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:09:18 crc kubenswrapper[4789]: E1216 07:09:18.532321 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" podUID="a9aa6ddb-befe-472b-bbaf-c17285d7ade4" Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.395001 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" podUID="a9aa6ddb-befe-472b-bbaf-c17285d7ade4" Dec 16 07:09:19 crc kubenswrapper[4789]: I1216 07:09:19.430305 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.430469 4789 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.430531 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert podName:d30a0974-7667-4999-9c46-3970ad1a6a8b nodeName:}" failed. No retries permitted until 2025-12-16 07:09:35.430512853 +0000 UTC m=+1113.692400492 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert") pod "infra-operator-controller-manager-84b495f78-j97qq" (UID: "d30a0974-7667-4999-9c46-3970ad1a6a8b") : secret "infra-operator-webhook-server-cert" not found Dec 16 07:09:19 crc kubenswrapper[4789]: I1216 07:09:19.735725 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.735858 4789 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.735953 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert podName:a08c1d95-200f-40ce-abef-dbb505570602 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:35.735934176 +0000 UTC m=+1113.997821825 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert") pod "openstack-baremetal-operator-controller-manager-66fff4bf6br444w" (UID: "a08c1d95-200f-40ce-abef-dbb505570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 07:09:19 crc kubenswrapper[4789]: I1216 07:09:19.939002 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.939117 4789 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 07:09:19 crc kubenswrapper[4789]: I1216 07:09:19.939135 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.939169 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. 
No retries permitted until 2025-12-16 07:09:35.939154963 +0000 UTC m=+1114.201042592 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "metrics-server-cert" not found Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.939295 4789 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 07:09:19 crc kubenswrapper[4789]: E1216 07:09:19.939389 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs podName:467a702b-f3c6-42ef-ba9f-ec19e7d2a291 nodeName:}" failed. No retries permitted until 2025-12-16 07:09:35.939367578 +0000 UTC m=+1114.201255277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs") pod "openstack-operator-controller-manager-678747d7fb-swjqj" (UID: "467a702b-f3c6-42ef-ba9f-ec19e7d2a291") : secret "webhook-server-cert" not found Dec 16 07:09:26 crc kubenswrapper[4789]: E1216 07:09:26.036007 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 16 07:09:26 crc kubenswrapper[4789]: E1216 07:09:26.036760 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5lzjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bf6d4f946-6ztbz_openstack-operators(826f108e-bfd8-43bb-8719-d9a569778578): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:09:26 crc kubenswrapper[4789]: E1216 07:09:26.037983 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" podUID="826f108e-bfd8-43bb-8719-d9a569778578" Dec 16 07:09:26 crc kubenswrapper[4789]: E1216 07:09:26.441319 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" podUID="826f108e-bfd8-43bb-8719-d9a569778578" Dec 16 07:09:26 crc kubenswrapper[4789]: E1216 07:09:26.735630 4789 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 16 07:09:26 crc kubenswrapper[4789]: E1216 07:09:26.735844 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wmpsx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-5c7cbf548f-pxvzs_openstack-operators(5d5d9279-e35b-4b95-be8e-dc54a056e7b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:09:26 crc kubenswrapper[4789]: E1216 07:09:26.742068 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" podUID="5d5d9279-e35b-4b95-be8e-dc54a056e7b5" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.303155 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.303567 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdgr7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-68c649d9d-d99lr_openstack-operators(d03193ca-0584-4d77-bab4-5e42abf5b5b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.304745 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" podUID="d03193ca-0584-4d77-bab4-5e42abf5b5b5" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.445487 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" podUID="d03193ca-0584-4d77-bab4-5e42abf5b5b5" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.446028 4789 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" podUID="5d5d9279-e35b-4b95-be8e-dc54a056e7b5" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.787652 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.787856 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qfhm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5f98b4754f-g2w7k_openstack-operators(cc2943a0-fd8f-49bd-bf85-aa6fb274e999): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:09:27 crc kubenswrapper[4789]: E1216 07:09:27.789208 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" podUID="cc2943a0-fd8f-49bd-bf85-aa6fb274e999" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.247156 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.247309 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9gsn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-d6bcb_openstack-operators(c7268550-e5d4-4664-b04d-ecfa498cb475): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.249488 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" podUID="c7268550-e5d4-4664-b04d-ecfa498cb475" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.460193 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" podUID="c7268550-e5d4-4664-b04d-ecfa498cb475" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.460220 4789 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" podUID="cc2943a0-fd8f-49bd-bf85-aa6fb274e999" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.714604 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.714756 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w4fzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-k7fwx_openstack-operators(26befb39-90f5-4fa1-8f8a-3b82ebae6472): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:09:28 crc kubenswrapper[4789]: E1216 07:09:28.715883 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" podUID="26befb39-90f5-4fa1-8f8a-3b82ebae6472" Dec 16 07:09:29 crc kubenswrapper[4789]: E1216 07:09:29.467875 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" podUID="26befb39-90f5-4fa1-8f8a-3b82ebae6472" Dec 16 07:09:33 crc 
kubenswrapper[4789]: I1216 07:09:33.492766 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" event={"ID":"07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0","Type":"ContainerStarted","Data":"8a77af64f347a092177bf072b3438bccf2b6257dae6d8e88fbb4877429a27cf9"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.494293 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.494337 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" event={"ID":"1950613d-02b6-4c9f-925a-e3ece57069ed","Type":"ContainerStarted","Data":"b3235b75ec5ac5e779374e049b9901db82723148f6d64f52bc9d691a05d7cb8a"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.494561 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.495533 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" event={"ID":"10da721c-ec68-4b14-b65e-ebf283e4ba59","Type":"ContainerStarted","Data":"f66aee83575c064dcd5dcc19a1a13fc20e0bee17af21d9cd61eee788cfc73701"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.495716 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.497107 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" event={"ID":"0152085f-c1f6-478c-9044-749eb51fad39","Type":"ContainerStarted","Data":"c4e28c1a1ddb5026cb9de7aef2b46b485bb0e8d847a4591a5e214e364bec6648"} 
Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.497252 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.498521 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" event={"ID":"8fce40f9-3595-4e54-816f-9e567e87ef4b","Type":"ContainerStarted","Data":"b24c9ee1beb4338e1a8c52d3f41a584e70bf5f8b55629dc16eff2794cc9c549b"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.498785 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.499873 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" event={"ID":"226147eb-5ae9-43a3-8d68-19115b510a2f","Type":"ContainerStarted","Data":"43c96c7c679d5f62569d7547c8a8e3d6f71013d8a6c6756dadba2d407816076f"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.500050 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.502114 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" event={"ID":"db9badf9-8fa3-484a-8ca4-ffa31c0c29c5","Type":"ContainerStarted","Data":"c1f875f982c33d3780f80c536608c259fd2553d193be1c0c4bd08a4a920d4d51"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.502355 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.503539 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" event={"ID":"822ac1df-18a3-4440-bd77-507c589ff693","Type":"ContainerStarted","Data":"d4d15fc8a0291439f6cfba0e8c587f99665bae81f1d75993ccc3eff9fee92ed7"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.504998 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" event={"ID":"dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f","Type":"ContainerStarted","Data":"5c260e11b52b51b815a0ad630ad0a0b7237f727bba018959109eb60923927eef"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.505077 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.506374 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" event={"ID":"6132cbf3-8a9f-4505-adcc-2e46beb5bf0e","Type":"ContainerStarted","Data":"d6c87d91a4794b6eb6264103db09663c48416c291d7cc70adaaaec439104030d"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.506764 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.508264 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" event={"ID":"4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d","Type":"ContainerStarted","Data":"462845d5e8d43171beb1a1b94cb4268ec704662b0d8b94074f8df941bc33295a"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.508433 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.509837 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" event={"ID":"f4d189a6-9923-41a3-be17-a18a76b9d382","Type":"ContainerStarted","Data":"67372c830f97971e6ab072e0afad51e43746e622128dbab4470afb31106b3427"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.511388 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" event={"ID":"5db5b7f8-cc13-42b5-9c72-87bf990091d2","Type":"ContainerStarted","Data":"d63447a73e9f272bc6e81f666af792f9ca44fa838de266a4694eb6b0cbed7a13"} Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.511556 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.567343 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" podStartSLOduration=3.907435244 podStartE2EDuration="30.567316537s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.764288368 +0000 UTC m=+1084.026175997" lastFinishedPulling="2025-12-16 07:09:32.424169651 +0000 UTC m=+1110.686057290" observedRunningTime="2025-12-16 07:09:33.560265655 +0000 UTC m=+1111.822153284" watchObservedRunningTime="2025-12-16 07:09:33.567316537 +0000 UTC m=+1111.829204176" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.604215 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" podStartSLOduration=3.977479546 podStartE2EDuration="30.604196638s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.758092377 +0000 UTC m=+1084.019980006" lastFinishedPulling="2025-12-16 07:09:32.384809469 +0000 UTC m=+1110.646697098" 
observedRunningTime="2025-12-16 07:09:33.603174153 +0000 UTC m=+1111.865061782" watchObservedRunningTime="2025-12-16 07:09:33.604196638 +0000 UTC m=+1111.866084267" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.762681 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" podStartSLOduration=4.128989989 podStartE2EDuration="30.762665211s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.760104436 +0000 UTC m=+1084.021992065" lastFinishedPulling="2025-12-16 07:09:32.393779658 +0000 UTC m=+1110.655667287" observedRunningTime="2025-12-16 07:09:33.762109477 +0000 UTC m=+1112.023997116" watchObservedRunningTime="2025-12-16 07:09:33.762665211 +0000 UTC m=+1112.024552840" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.765225 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" podStartSLOduration=4.607684858 podStartE2EDuration="30.765213343s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.104252039 +0000 UTC m=+1083.366139668" lastFinishedPulling="2025-12-16 07:09:31.261780514 +0000 UTC m=+1109.523668153" observedRunningTime="2025-12-16 07:09:33.743754939 +0000 UTC m=+1112.005642568" watchObservedRunningTime="2025-12-16 07:09:33.765213343 +0000 UTC m=+1112.027100972" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.827615 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.855099 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.841752 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" podStartSLOduration=4.921473595 podStartE2EDuration="30.841735313s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.341571468 +0000 UTC m=+1083.603459097" lastFinishedPulling="2025-12-16 07:09:31.261833186 +0000 UTC m=+1109.523720815" observedRunningTime="2025-12-16 07:09:33.835933581 +0000 UTC m=+1112.097821210" watchObservedRunningTime="2025-12-16 07:09:33.841735313 +0000 UTC m=+1112.103622942" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.860040 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" podStartSLOduration=6.18156892 podStartE2EDuration="30.86002202s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:04.475161495 +0000 UTC m=+1082.737049124" lastFinishedPulling="2025-12-16 07:09:29.153614595 +0000 UTC m=+1107.415502224" observedRunningTime="2025-12-16 07:09:33.816758963 +0000 UTC m=+1112.078646592" watchObservedRunningTime="2025-12-16 07:09:33.86002202 +0000 UTC m=+1112.121909649" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.884825 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" podStartSLOduration=4.969859647 podStartE2EDuration="30.884810675s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.346660062 +0000 UTC m=+1083.608547691" lastFinishedPulling="2025-12-16 07:09:31.26161109 +0000 UTC m=+1109.523498719" observedRunningTime="2025-12-16 07:09:33.874310799 +0000 UTC m=+1112.136198428" watchObservedRunningTime="2025-12-16 07:09:33.884810675 +0000 UTC m=+1112.146698304" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.932327 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" podStartSLOduration=5.066611562 podStartE2EDuration="30.932309336s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.395902486 +0000 UTC m=+1083.657790115" lastFinishedPulling="2025-12-16 07:09:31.26160026 +0000 UTC m=+1109.523487889" observedRunningTime="2025-12-16 07:09:33.912267797 +0000 UTC m=+1112.174155426" watchObservedRunningTime="2025-12-16 07:09:33.932309336 +0000 UTC m=+1112.194196965" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.935696 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" podStartSLOduration=4.273625093 podStartE2EDuration="30.935685789s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.722348863 +0000 UTC m=+1083.984236482" lastFinishedPulling="2025-12-16 07:09:32.384409559 +0000 UTC m=+1110.646297178" observedRunningTime="2025-12-16 07:09:33.931508307 +0000 UTC m=+1112.193395936" watchObservedRunningTime="2025-12-16 07:09:33.935685789 +0000 UTC m=+1112.197573438" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.956196 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" podStartSLOduration=4.311355296 podStartE2EDuration="30.95617926s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.739489553 +0000 UTC m=+1084.001377192" lastFinishedPulling="2025-12-16 07:09:32.384313527 +0000 UTC m=+1110.646201156" observedRunningTime="2025-12-16 07:09:33.953810632 +0000 UTC m=+1112.215698261" watchObservedRunningTime="2025-12-16 07:09:33.95617926 +0000 UTC m=+1112.218066889" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.981485 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" podStartSLOduration=5.441776321 podStartE2EDuration="30.981471028s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.721907363 +0000 UTC m=+1083.983794992" lastFinishedPulling="2025-12-16 07:09:31.26160207 +0000 UTC m=+1109.523489699" observedRunningTime="2025-12-16 07:09:33.977180303 +0000 UTC m=+1112.239067932" watchObservedRunningTime="2025-12-16 07:09:33.981471028 +0000 UTC m=+1112.243358657" Dec 16 07:09:33 crc kubenswrapper[4789]: I1216 07:09:33.998838 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" podStartSLOduration=5.081555497 podStartE2EDuration="30.998819262s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.344375496 +0000 UTC m=+1083.606263135" lastFinishedPulling="2025-12-16 07:09:31.261639271 +0000 UTC m=+1109.523526900" observedRunningTime="2025-12-16 07:09:33.994894366 +0000 UTC m=+1112.256781995" watchObservedRunningTime="2025-12-16 07:09:33.998819262 +0000 UTC m=+1112.260706891" Dec 16 07:09:34 crc kubenswrapper[4789]: I1216 07:09:34.017560 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" podStartSLOduration=6.283042319 podStartE2EDuration="31.017544649s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.653751767 +0000 UTC m=+1083.915639396" lastFinishedPulling="2025-12-16 07:09:30.388254097 +0000 UTC m=+1108.650141726" observedRunningTime="2025-12-16 07:09:34.016441832 +0000 UTC m=+1112.278329461" watchObservedRunningTime="2025-12-16 07:09:34.017544649 +0000 UTC m=+1112.279432278" Dec 16 07:09:34 crc kubenswrapper[4789]: I1216 07:09:34.518546 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" event={"ID":"a9aa6ddb-befe-472b-bbaf-c17285d7ade4","Type":"ContainerStarted","Data":"8fe11436a746d3b7367cfd315c243d428af9d9195b38bcf4d999a9217570eb04"} Dec 16 07:09:34 crc kubenswrapper[4789]: I1216 07:09:34.543208 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" podStartSLOduration=3.652935096 podStartE2EDuration="31.543191036s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.720771895 +0000 UTC m=+1083.982659524" lastFinishedPulling="2025-12-16 07:09:33.611027835 +0000 UTC m=+1111.872915464" observedRunningTime="2025-12-16 07:09:34.542365445 +0000 UTC m=+1112.804253074" watchObservedRunningTime="2025-12-16 07:09:34.543191036 +0000 UTC m=+1112.805078685" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.460430 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.468302 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d30a0974-7667-4999-9c46-3970ad1a6a8b-cert\") pod \"infra-operator-controller-manager-84b495f78-j97qq\" (UID: \"d30a0974-7667-4999-9c46-3970ad1a6a8b\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.742478 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.765390 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.771881 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a08c1d95-200f-40ce-abef-dbb505570602-cert\") pod \"openstack-baremetal-operator-controller-manager-66fff4bf6br444w\" (UID: \"a08c1d95-200f-40ce-abef-dbb505570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.967999 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.968375 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.972155 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-webhook-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:35 crc kubenswrapper[4789]: I1216 07:09:35.973582 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/467a702b-f3c6-42ef-ba9f-ec19e7d2a291-metrics-certs\") pod \"openstack-operator-controller-manager-678747d7fb-swjqj\" (UID: \"467a702b-f3c6-42ef-ba9f-ec19e7d2a291\") " pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:36 crc kubenswrapper[4789]: I1216 07:09:36.053293 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:36 crc kubenswrapper[4789]: I1216 07:09:36.076059 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:36 crc kubenswrapper[4789]: I1216 07:09:36.160127 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-j97qq"] Dec 16 07:09:36 crc kubenswrapper[4789]: W1216 07:09:36.164523 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd30a0974_7667_4999_9c46_3970ad1a6a8b.slice/crio-9d40d936dbb7abf88c2bc7462f3a1accf13a57a3eb44a20021c3f4b0ddd200ba WatchSource:0}: Error finding container 9d40d936dbb7abf88c2bc7462f3a1accf13a57a3eb44a20021c3f4b0ddd200ba: Status 404 returned error can't find the container with id 9d40d936dbb7abf88c2bc7462f3a1accf13a57a3eb44a20021c3f4b0ddd200ba Dec 16 07:09:36 crc kubenswrapper[4789]: I1216 07:09:36.483014 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w"] Dec 16 07:09:36 crc kubenswrapper[4789]: I1216 07:09:36.531801 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj"] Dec 16 07:09:36 crc kubenswrapper[4789]: I1216 07:09:36.532755 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" event={"ID":"a08c1d95-200f-40ce-abef-dbb505570602","Type":"ContainerStarted","Data":"2519b3796ff92583446a72814da12e29aaea5dfaf6a519cc65b848b42da1c16a"} Dec 16 07:09:36 crc kubenswrapper[4789]: I1216 07:09:36.533592 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" event={"ID":"d30a0974-7667-4999-9c46-3970ad1a6a8b","Type":"ContainerStarted","Data":"9d40d936dbb7abf88c2bc7462f3a1accf13a57a3eb44a20021c3f4b0ddd200ba"} Dec 16 07:09:36 crc kubenswrapper[4789]: 
W1216 07:09:36.542657 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467a702b_f3c6_42ef_ba9f_ec19e7d2a291.slice/crio-47c07ce7b427551428362b12bde8c64162376e71150c3217cd9dcf20e82a3edd WatchSource:0}: Error finding container 47c07ce7b427551428362b12bde8c64162376e71150c3217cd9dcf20e82a3edd: Status 404 returned error can't find the container with id 47c07ce7b427551428362b12bde8c64162376e71150c3217cd9dcf20e82a3edd Dec 16 07:09:37 crc kubenswrapper[4789]: I1216 07:09:37.541819 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" event={"ID":"467a702b-f3c6-42ef-ba9f-ec19e7d2a291","Type":"ContainerStarted","Data":"44db4aa48ae76d2e5409832caaf7f84d4ecae19092d962a46ac4c590fe7d3f9b"} Dec 16 07:09:37 crc kubenswrapper[4789]: I1216 07:09:37.542190 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:37 crc kubenswrapper[4789]: I1216 07:09:37.542221 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" event={"ID":"467a702b-f3c6-42ef-ba9f-ec19e7d2a291","Type":"ContainerStarted","Data":"47c07ce7b427551428362b12bde8c64162376e71150c3217cd9dcf20e82a3edd"} Dec 16 07:09:37 crc kubenswrapper[4789]: I1216 07:09:37.575300 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" podStartSLOduration=33.575280823 podStartE2EDuration="33.575280823s" podCreationTimestamp="2025-12-16 07:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:09:37.567636757 +0000 UTC m=+1115.829524386" watchObservedRunningTime="2025-12-16 07:09:37.575280823 +0000 
UTC m=+1115.837168452" Dec 16 07:09:39 crc kubenswrapper[4789]: I1216 07:09:39.554411 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" event={"ID":"a08c1d95-200f-40ce-abef-dbb505570602","Type":"ContainerStarted","Data":"f84aca192efdbd3aeee6d5a868ee58aa0f2a9a356436542628745a388a16f406"} Dec 16 07:09:39 crc kubenswrapper[4789]: I1216 07:09:39.555080 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:39 crc kubenswrapper[4789]: I1216 07:09:39.555605 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" event={"ID":"d30a0974-7667-4999-9c46-3970ad1a6a8b","Type":"ContainerStarted","Data":"1069d18137b7f1220f381ef3ec7a5beedf67b3dc0f36a5ed8a41f9dacd779dca"} Dec 16 07:09:39 crc kubenswrapper[4789]: I1216 07:09:39.555773 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:39 crc kubenswrapper[4789]: I1216 07:09:39.605694 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" podStartSLOduration=33.445448391 podStartE2EDuration="36.605672781s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:36.168134075 +0000 UTC m=+1114.430021704" lastFinishedPulling="2025-12-16 07:09:39.328358465 +0000 UTC m=+1117.590246094" observedRunningTime="2025-12-16 07:09:39.59779408 +0000 UTC m=+1117.859681729" watchObservedRunningTime="2025-12-16 07:09:39.605672781 +0000 UTC m=+1117.867560410" Dec 16 07:09:39 crc kubenswrapper[4789]: I1216 07:09:39.606165 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" podStartSLOduration=33.770587479 podStartE2EDuration="36.606156474s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:36.48422862 +0000 UTC m=+1114.746116249" lastFinishedPulling="2025-12-16 07:09:39.319797615 +0000 UTC m=+1117.581685244" observedRunningTime="2025-12-16 07:09:39.582729781 +0000 UTC m=+1117.844617410" watchObservedRunningTime="2025-12-16 07:09:39.606156474 +0000 UTC m=+1117.868044103" Dec 16 07:09:40 crc kubenswrapper[4789]: I1216 07:09:40.564701 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" event={"ID":"5d5d9279-e35b-4b95-be8e-dc54a056e7b5","Type":"ContainerStarted","Data":"7ceb976f5af2e8fce6a6f2af5edbceb25fd261a666c9792e503ab9960c32559a"} Dec 16 07:09:40 crc kubenswrapper[4789]: I1216 07:09:40.566180 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" Dec 16 07:09:40 crc kubenswrapper[4789]: I1216 07:09:40.567933 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" event={"ID":"cc2943a0-fd8f-49bd-bf85-aa6fb274e999","Type":"ContainerStarted","Data":"1244f85a5b078a9accd1b6eb88abc2c3fb92613ac9177149629a5928b7d31f3b"} Dec 16 07:09:40 crc kubenswrapper[4789]: I1216 07:09:40.568238 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" Dec 16 07:09:40 crc kubenswrapper[4789]: I1216 07:09:40.632352 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" podStartSLOduration=3.690149295 podStartE2EDuration="37.632330452s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" 
firstStartedPulling="2025-12-16 07:09:05.679097716 +0000 UTC m=+1083.940985345" lastFinishedPulling="2025-12-16 07:09:39.621278863 +0000 UTC m=+1117.883166502" observedRunningTime="2025-12-16 07:09:40.61427359 +0000 UTC m=+1118.876161219" watchObservedRunningTime="2025-12-16 07:09:40.632330452 +0000 UTC m=+1118.894218081" Dec 16 07:09:40 crc kubenswrapper[4789]: I1216 07:09:40.648580 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" podStartSLOduration=2.096393506 podStartE2EDuration="37.648562058s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:04.474519129 +0000 UTC m=+1082.736406758" lastFinishedPulling="2025-12-16 07:09:40.026687691 +0000 UTC m=+1118.288575310" observedRunningTime="2025-12-16 07:09:40.643315799 +0000 UTC m=+1118.905203428" watchObservedRunningTime="2025-12-16 07:09:40.648562058 +0000 UTC m=+1118.910449707" Dec 16 07:09:41 crc kubenswrapper[4789]: I1216 07:09:41.576070 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" event={"ID":"826f108e-bfd8-43bb-8719-d9a569778578","Type":"ContainerStarted","Data":"4faa8eb4d2b770f5054b9d977d0e1ac984fc0704a51451b5c41bd79ee5be4c40"} Dec 16 07:09:41 crc kubenswrapper[4789]: I1216 07:09:41.607276 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" podStartSLOduration=3.37529631 podStartE2EDuration="38.607261157s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.674559795 +0000 UTC m=+1083.936447424" lastFinishedPulling="2025-12-16 07:09:40.906524642 +0000 UTC m=+1119.168412271" observedRunningTime="2025-12-16 07:09:41.601540127 +0000 UTC m=+1119.863427756" watchObservedRunningTime="2025-12-16 07:09:41.607261157 +0000 UTC m=+1119.869148786" Dec 16 07:09:43 
crc kubenswrapper[4789]: I1216 07:09:43.594105 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" event={"ID":"d03193ca-0584-4d77-bab4-5e42abf5b5b5","Type":"ContainerStarted","Data":"2ae3cd858e3f11ac483ec65edeabe221fea388969f1edd66907249041f08093c"} Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.594706 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.612131 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" podStartSLOduration=3.57619549 podStartE2EDuration="40.612115371s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.681746461 +0000 UTC m=+1083.943634090" lastFinishedPulling="2025-12-16 07:09:42.717666342 +0000 UTC m=+1120.979553971" observedRunningTime="2025-12-16 07:09:43.606815291 +0000 UTC m=+1121.868702920" watchObservedRunningTime="2025-12-16 07:09:43.612115371 +0000 UTC m=+1121.874003000" Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.758772 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-95949466-hlm8z" Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.798547 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-gkhql" Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.834475 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-bm8v5" Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.847867 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-2t6n8" Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.856288 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-72lbq" Dec 16 07:09:43 crc kubenswrapper[4789]: I1216 07:09:43.909021 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-8wxm6" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.021233 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-f46p9" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.049236 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-pxvzs" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.133828 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-p29cz" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.137514 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-jfxhw" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.254008 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.274901 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-b6v9v" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.324808 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.327216 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-f6ndr" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.345451 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-8dfvw" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.376989 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-8vdpj" Dec 16 07:09:44 crc kubenswrapper[4789]: I1216 07:09:44.463654 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-4xsfm" Dec 16 07:09:45 crc kubenswrapper[4789]: I1216 07:09:45.607648 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" event={"ID":"c7268550-e5d4-4664-b04d-ecfa498cb475","Type":"ContainerStarted","Data":"f3cf70dbb737ecfb201d8e5f960b11a098525ab592f5648c9590a4f790cd1c83"} Dec 16 07:09:45 crc kubenswrapper[4789]: I1216 07:09:45.608238 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" Dec 16 07:09:45 crc kubenswrapper[4789]: I1216 07:09:45.609369 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" event={"ID":"26befb39-90f5-4fa1-8f8a-3b82ebae6472","Type":"ContainerStarted","Data":"be391f4c042b022b5b40315201a75ef602a238cf5159e177a1eb7f49cf914e4d"} Dec 16 07:09:45 crc kubenswrapper[4789]: I1216 07:09:45.629455 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" podStartSLOduration=3.775873099 podStartE2EDuration="42.62943553s" podCreationTimestamp="2025-12-16 07:09:03 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.719102484 +0000 UTC m=+1083.980990113" lastFinishedPulling="2025-12-16 07:09:44.572664915 +0000 UTC m=+1122.834552544" observedRunningTime="2025-12-16 07:09:45.623657598 +0000 UTC m=+1123.885545247" watchObservedRunningTime="2025-12-16 07:09:45.62943553 +0000 UTC m=+1123.891323159" Dec 16 07:09:45 crc kubenswrapper[4789]: I1216 07:09:45.644923 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k7fwx" podStartSLOduration=2.813786747 podStartE2EDuration="41.644890258s" podCreationTimestamp="2025-12-16 07:09:04 +0000 UTC" firstStartedPulling="2025-12-16 07:09:05.743689085 +0000 UTC m=+1084.005576714" lastFinishedPulling="2025-12-16 07:09:44.574792596 +0000 UTC m=+1122.836680225" observedRunningTime="2025-12-16 07:09:45.642899829 +0000 UTC m=+1123.904787458" watchObservedRunningTime="2025-12-16 07:09:45.644890258 +0000 UTC m=+1123.906777887" Dec 16 07:09:45 crc kubenswrapper[4789]: I1216 07:09:45.750436 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84b495f78-j97qq" Dec 16 07:09:46 crc kubenswrapper[4789]: I1216 07:09:46.059478 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-66fff4bf6br444w" Dec 16 07:09:46 crc kubenswrapper[4789]: I1216 07:09:46.084019 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-678747d7fb-swjqj" Dec 16 07:09:53 crc kubenswrapper[4789]: I1216 07:09:53.773856 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-g2w7k" Dec 16 07:09:54 crc kubenswrapper[4789]: I1216 07:09:54.160137 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-d6bcb" Dec 16 07:09:54 crc kubenswrapper[4789]: I1216 07:09:54.230079 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-d99lr" Dec 16 07:09:54 crc kubenswrapper[4789]: I1216 07:09:54.255422 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-6ztbz" Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.877568 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-pq2b2"] Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.880110 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.882272 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.882629 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2nfkh" Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.885783 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.886016 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.920437 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-pq2b2"] Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.962625 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b2fd\" (UniqueName: \"kubernetes.io/projected/80eee5ab-8219-48e4-98ae-7be381adbc39-kube-api-access-6b2fd\") pod \"dnsmasq-dns-84bb9d8bd9-pq2b2\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:09 crc kubenswrapper[4789]: I1216 07:10:09.962741 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80eee5ab-8219-48e4-98ae-7be381adbc39-config\") pod \"dnsmasq-dns-84bb9d8bd9-pq2b2\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.010105 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-f8w4t"] Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.011489 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.013425 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.025362 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-f8w4t"] Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.064215 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b2fd\" (UniqueName: \"kubernetes.io/projected/80eee5ab-8219-48e4-98ae-7be381adbc39-kube-api-access-6b2fd\") pod \"dnsmasq-dns-84bb9d8bd9-pq2b2\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.064340 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/80eee5ab-8219-48e4-98ae-7be381adbc39-config\") pod \"dnsmasq-dns-84bb9d8bd9-pq2b2\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.065397 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80eee5ab-8219-48e4-98ae-7be381adbc39-config\") pod \"dnsmasq-dns-84bb9d8bd9-pq2b2\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.092844 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b2fd\" (UniqueName: \"kubernetes.io/projected/80eee5ab-8219-48e4-98ae-7be381adbc39-kube-api-access-6b2fd\") pod \"dnsmasq-dns-84bb9d8bd9-pq2b2\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.165158 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-dns-svc\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.165224 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-config\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.165326 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj59z\" (UniqueName: 
\"kubernetes.io/projected/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-kube-api-access-mj59z\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.202221 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.267148 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj59z\" (UniqueName: \"kubernetes.io/projected/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-kube-api-access-mj59z\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.267304 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-dns-svc\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.268442 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-config\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.268811 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-dns-svc\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.269307 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-config\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.286998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj59z\" (UniqueName: \"kubernetes.io/projected/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-kube-api-access-mj59z\") pod \"dnsmasq-dns-5f854695bc-f8w4t\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.327162 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.687524 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.687707 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-pq2b2"] Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.750188 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-f8w4t"] Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.796456 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" event={"ID":"80eee5ab-8219-48e4-98ae-7be381adbc39","Type":"ContainerStarted","Data":"6ee590e7e19f480d576bc052f37653135fefb57173943839012788563bd31599"} Dec 16 07:10:10 crc kubenswrapper[4789]: I1216 07:10:10.797783 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" event={"ID":"b3c2ad40-ba60-4f27-af90-7e7455ff4f85","Type":"ContainerStarted","Data":"fdf90b3fc166cfa22823f6e392d56ba177d5735a9af8d85ab0bbe5613c59d9fe"} 
Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.436502 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-f8w4t"] Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.457573 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-4lkfh"] Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.458890 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.476145 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-4lkfh"] Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.597885 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-config\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.598088 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.598169 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5576\" (UniqueName: \"kubernetes.io/projected/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-kube-api-access-z5576\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.699424 4789 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-config\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.699549 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.699595 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5576\" (UniqueName: \"kubernetes.io/projected/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-kube-api-access-z5576\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.700776 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-config\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.701349 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-dns-svc\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.737047 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5576\" (UniqueName: 
\"kubernetes.io/projected/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-kube-api-access-z5576\") pod \"dnsmasq-dns-c7cbb8f79-4lkfh\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.777127 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.815440 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-pq2b2"] Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.862123 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-kvmbq"] Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.864012 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:12 crc kubenswrapper[4789]: I1216 07:10:12.881836 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-kvmbq"] Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.008184 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-dns-svc\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.008374 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhkk\" (UniqueName: \"kubernetes.io/projected/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-kube-api-access-mvhkk\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.008445 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-config\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.109266 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-dns-svc\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.109360 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhkk\" (UniqueName: \"kubernetes.io/projected/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-kube-api-access-mvhkk\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.109402 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-config\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.110461 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-config\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.110846 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-dns-svc\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.139769 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhkk\" (UniqueName: \"kubernetes.io/projected/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-kube-api-access-mvhkk\") pod \"dnsmasq-dns-95f5f6995-kvmbq\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.214374 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.342558 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-4lkfh"] Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.669276 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.670576 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.675546 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.675595 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.675737 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.675817 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.675864 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.675753 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.676017 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.676081 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nwjj9" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.681660 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-kvmbq"] Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821453 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821502 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821528 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821596 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9452e1b2-42ec-47b6-96e1-2770c9e76db2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821626 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821675 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 
07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821733 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9452e1b2-42ec-47b6-96e1-2770c9e76db2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821754 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821868 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821905 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxnm\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-kube-api-access-9pxnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.821965 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 
crc kubenswrapper[4789]: I1216 07:10:13.913499 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" event={"ID":"ec04a3ac-e0cd-491a-8af6-c2bbfaece281","Type":"ContainerStarted","Data":"f22a90872cb03df967e184a76a05de17f370ef5f2d425e1e334df3b15058b97c"} Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.916646 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" event={"ID":"ae0e1a2e-99bd-4973-8482-685c1b9d2fee","Type":"ContainerStarted","Data":"b725897d3c919baaa39ff1937d91a1f935c96db914f4e8dbf442742490b949fc"} Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924509 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924579 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pxnm\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-kube-api-access-9pxnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924606 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924636 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924666 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924690 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924707 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9452e1b2-42ec-47b6-96e1-2770c9e76db2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924746 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924767 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924798 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9452e1b2-42ec-47b6-96e1-2770c9e76db2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.924814 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.926951 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.928281 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.929328 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 
07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.929443 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.930451 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.932448 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.933242 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.934836 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9452e1b2-42ec-47b6-96e1-2770c9e76db2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.946593 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.949399 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9452e1b2-42ec-47b6-96e1-2770c9e76db2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.956863 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pxnm\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-kube-api-access-9pxnm\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.966610 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.970999 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.973034 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.982211 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.982498 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.982686 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.982800 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.982695 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b9crk" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.982743 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.983161 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.992809 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:10:13 crc kubenswrapper[4789]: I1216 07:10:13.997974 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.127766 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxw8s\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-kube-api-access-wxw8s\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.127847 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.127874 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.127899 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.127929 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/31336d9f-38cf-4805-927b-3ae986f6c88e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.127958 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.127974 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.128002 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31336d9f-38cf-4805-927b-3ae986f6c88e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.128020 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.128033 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.128061 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230012 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxw8s\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-kube-api-access-wxw8s\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230054 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230072 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230099 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230113 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31336d9f-38cf-4805-927b-3ae986f6c88e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230138 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230155 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230184 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31336d9f-38cf-4805-927b-3ae986f6c88e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230203 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230228 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.230249 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.231217 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.231256 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.231371 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.231426 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.232336 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.232582 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.236240 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31336d9f-38cf-4805-927b-3ae986f6c88e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.236760 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31336d9f-38cf-4805-927b-3ae986f6c88e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.237257 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.238982 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.247865 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxw8s\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-kube-api-access-wxw8s\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.262680 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.387528 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.504686 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:10:14 crc kubenswrapper[4789]: W1216 07:10:14.519202 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9452e1b2_42ec_47b6_96e1_2770c9e76db2.slice/crio-5c4015381169ad0d393422d1b91124ef0e4d0ced98d367a59839b4ee28aa3294 WatchSource:0}: Error finding container 5c4015381169ad0d393422d1b91124ef0e4d0ced98d367a59839b4ee28aa3294: Status 404 returned error can't find the container with id 5c4015381169ad0d393422d1b91124ef0e4d0ced98d367a59839b4ee28aa3294 Dec 16 07:10:14 crc kubenswrapper[4789]: I1216 07:10:14.932683 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9452e1b2-42ec-47b6-96e1-2770c9e76db2","Type":"ContainerStarted","Data":"5c4015381169ad0d393422d1b91124ef0e4d0ced98d367a59839b4ee28aa3294"} Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.043506 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.045311 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.047666 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5nxcs" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.048247 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.048496 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.048727 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.054617 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.055425 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.146350 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.146752 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0" Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.146810 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.146975 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.147016 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.147079 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.147123 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2djvc\" (UniqueName: \"kubernetes.io/projected/d868c627-a661-4c69-afd7-26d88b2be0ec-kube-api-access-2djvc\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.147148 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.149577 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249102 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249156 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2djvc\" (UniqueName: \"kubernetes.io/projected/d868c627-a661-4c69-afd7-26d88b2be0ec-kube-api-access-2djvc\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249200 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249281 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249361 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249428 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249485 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249518 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249552 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.249931 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.250642 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.251025 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.254851 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.255684 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.258027 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.269828 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2djvc\" (UniqueName: \"kubernetes.io/projected/d868c627-a661-4c69-afd7-26d88b2be0ec-kube-api-access-2djvc\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.275873 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.430185 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 16 07:10:15 crc kubenswrapper[4789]: I1216 07:10:15.947836 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31336d9f-38cf-4805-927b-3ae986f6c88e","Type":"ContainerStarted","Data":"664e34f45fd05d46960a179707cf44e99059311b9833755a4dd8fc792989bfe9"}
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.060989 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.422365 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.423593 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.429259 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.430536 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.430860 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.431092 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qrvnj"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.436431 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.480900 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.480988 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.481019 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ncrb\" (UniqueName: \"kubernetes.io/projected/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kube-api-access-6ncrb\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.481053 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.481123 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.481145 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.481167 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.481185 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582637 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582687 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ncrb\" (UniqueName: \"kubernetes.io/projected/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kube-api-access-6ncrb\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582725 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582771 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582792 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582816 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582834 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.582861 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.583257 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.583680 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.583902 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.584496 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.585462 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.595478 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.599001 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.602381 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ncrb\" (UniqueName: \"kubernetes.io/projected/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kube-api-access-6ncrb\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.636995 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.745422 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.901761 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.902737 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.906131 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.906651 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-jpz4k"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.907051 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.915073 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.994742 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.994801 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.994826 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-config-data\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.995021 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-kolla-config\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:16 crc kubenswrapper[4789]: I1216 07:10:16.995059 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4762f\" (UniqueName: \"kubernetes.io/projected/db3b1c91-5558-4afb-a9fc-dd75527451ee-kube-api-access-4762f\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.002943 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d868c627-a661-4c69-afd7-26d88b2be0ec","Type":"ContainerStarted","Data":"5739231fd2be0a2e5e2b3764d4bc5bab7c8d3c4dae5e2c3b256d50500c494c79"}
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.098006 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-kolla-config\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.098304 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4762f\" (UniqueName: \"kubernetes.io/projected/db3b1c91-5558-4afb-a9fc-dd75527451ee-kube-api-access-4762f\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.098332 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.098364 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.098381 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-config-data\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.099135 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-config-data\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.101296 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-kolla-config\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.105394 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.113284 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.124490 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4762f\" (UniqueName: \"kubernetes.io/projected/db3b1c91-5558-4afb-a9fc-dd75527451ee-kube-api-access-4762f\") pod \"memcached-0\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.235281 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.332160 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 16 07:10:17 crc kubenswrapper[4789]: I1216 07:10:17.737622 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.015767 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f699c71b-1e44-4a4d-b1fb-77ef105af03d","Type":"ContainerStarted","Data":"e3b3746f35fb16f83fe0120e8d75eb4fa7a73e27440d42530fadd50fe2af8eae"}
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.516524 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.517421 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.525287 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-r67dq"
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.540886 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.625408 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxxmz\" (UniqueName: \"kubernetes.io/projected/02b0fdb4-d395-4464-8250-4288ca50c8de-kube-api-access-mxxmz\") pod \"kube-state-metrics-0\" (UID: \"02b0fdb4-d395-4464-8250-4288ca50c8de\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.726811 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxxmz\" (UniqueName: \"kubernetes.io/projected/02b0fdb4-d395-4464-8250-4288ca50c8de-kube-api-access-mxxmz\") pod \"kube-state-metrics-0\" (UID: \"02b0fdb4-d395-4464-8250-4288ca50c8de\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.744780 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxxmz\" (UniqueName: \"kubernetes.io/projected/02b0fdb4-d395-4464-8250-4288ca50c8de-kube-api-access-mxxmz\") pod \"kube-state-metrics-0\" (UID: \"02b0fdb4-d395-4464-8250-4288ca50c8de\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:10:18 crc kubenswrapper[4789]: I1216 07:10:18.866471 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 07:10:20 crc kubenswrapper[4789]: W1216 07:10:20.869721 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb3b1c91_5558_4afb_a9fc_dd75527451ee.slice/crio-49f1a7097b5b2b7be517e1823df3bb70c8861aa3dd39af5ac54e40693abb09fe WatchSource:0}: Error finding container 49f1a7097b5b2b7be517e1823df3bb70c8861aa3dd39af5ac54e40693abb09fe: Status 404 returned error can't find the container with id 49f1a7097b5b2b7be517e1823df3bb70c8861aa3dd39af5ac54e40693abb09fe
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.051683 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"db3b1c91-5558-4afb-a9fc-dd75527451ee","Type":"ContainerStarted","Data":"49f1a7097b5b2b7be517e1823df3bb70c8861aa3dd39af5ac54e40693abb09fe"}
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.728160 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cw7z9"]
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.729106 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.731074 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.731256 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.734190 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tgmgm"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.760064 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9"]
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.781148 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-log-ovn\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.781204 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run-ovn\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.781288 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-scripts\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.781322 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-combined-ca-bundle\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.781363 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5q99\" (UniqueName: \"kubernetes.io/projected/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-kube-api-access-v5q99\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.781393 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-ovn-controller-tls-certs\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.781417 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.795666 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tblns"]
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.797443 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tblns"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.803345 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tblns"]
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882277 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5q99\" (UniqueName: \"kubernetes.io/projected/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-kube-api-access-v5q99\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882324 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-lib\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882342 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-ovn-controller-tls-certs\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882364 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882379 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5429404-d973-4580-961a-8ad6081e93ec-scripts\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882404 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-log\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882433 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-run\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882451 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-log-ovn\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882471 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run-ovn\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882489 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bfrk\" (UniqueName: \"kubernetes.io/projected/b5429404-d973-4580-961a-8ad6081e93ec-kube-api-access-5bfrk\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882524 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-etc-ovs\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882558 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-scripts\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.882582 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-combined-ca-bundle\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.883645 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-log-ovn\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.884678 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9"
Dec 16 07:10:21 crc kubenswrapper[4789]: I1216
07:10:21.884719 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run-ovn\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.885031 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-scripts\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.886387 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-ovn-controller-tls-certs\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.886424 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-combined-ca-bundle\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.904650 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5q99\" (UniqueName: \"kubernetes.io/projected/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-kube-api-access-v5q99\") pod \"ovn-controller-cw7z9\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " pod="openstack/ovn-controller-cw7z9" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.983797 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-etc-ovs\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.983962 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-lib\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.984017 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5429404-d973-4580-961a-8ad6081e93ec-scripts\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.984087 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-log\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.984130 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-run\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.984189 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bfrk\" (UniqueName: \"kubernetes.io/projected/b5429404-d973-4580-961a-8ad6081e93ec-kube-api-access-5bfrk\") pod \"ovn-controller-ovs-tblns\" (UID: 
\"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.984837 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-run\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.984838 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-etc-ovs\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.984988 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-lib\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.985059 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-log\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:21 crc kubenswrapper[4789]: I1216 07:10:21.986387 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5429404-d973-4580-961a-8ad6081e93ec-scripts\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:22 crc kubenswrapper[4789]: I1216 07:10:22.022477 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5bfrk\" (UniqueName: \"kubernetes.io/projected/b5429404-d973-4580-961a-8ad6081e93ec-kube-api-access-5bfrk\") pod \"ovn-controller-ovs-tblns\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:22 crc kubenswrapper[4789]: I1216 07:10:22.047643 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9" Dec 16 07:10:22 crc kubenswrapper[4789]: I1216 07:10:22.151506 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.500310 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.501944 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.503883 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-stc9h" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.504148 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.504153 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.504184 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.506413 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.551958 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.552055 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.552174 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.552239 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkcjb\" (UniqueName: \"kubernetes.io/projected/6456012f-c7be-458c-a9a5-b3958ae72c2c-kube-api-access-fkcjb\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.552322 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.552419 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.552457 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.552480 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-config\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.568261 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654155 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654219 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654256 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-config\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654296 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654335 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654355 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654377 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkcjb\" (UniqueName: \"kubernetes.io/projected/6456012f-c7be-458c-a9a5-b3958ae72c2c-kube-api-access-fkcjb\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.654411 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") 
" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.655056 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.655707 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-config\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.655750 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.656162 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.660110 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.661970 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.665615 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.674821 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkcjb\" (UniqueName: \"kubernetes.io/projected/6456012f-c7be-458c-a9a5-b3958ae72c2c-kube-api-access-fkcjb\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.683045 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") " pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.697638 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.698786 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.701083 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kschc" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.701232 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.701330 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.701797 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.708952 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756334 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756376 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756395 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756421 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6mk\" (UniqueName: \"kubernetes.io/projected/2fceb99a-9dfd-4d79-a0fd-666390de4440-kube-api-access-gl6mk\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756557 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756619 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-config\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756698 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.756722 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 
07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.839543 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.858150 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-config\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.858237 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.858271 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.858301 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.858326 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 
07:10:25.858350 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.858386 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6mk\" (UniqueName: \"kubernetes.io/projected/2fceb99a-9dfd-4d79-a0fd-666390de4440-kube-api-access-gl6mk\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.858448 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.859055 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-config\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.859478 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.859622 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.860106 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.862969 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.863620 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.864600 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.881127 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 
07:10:25 crc kubenswrapper[4789]: I1216 07:10:25.881267 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl6mk\" (UniqueName: \"kubernetes.io/projected/2fceb99a-9dfd-4d79-a0fd-666390de4440-kube-api-access-gl6mk\") pod \"ovsdbserver-sb-0\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:26 crc kubenswrapper[4789]: I1216 07:10:26.038695 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:30 crc kubenswrapper[4789]: E1216 07:10:30.151051 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13" Dec 16 07:10:30 crc kubenswrapper[4789]: E1216 07:10:30.151562 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2djvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(d868c627-a661-4c69-afd7-26d88b2be0ec): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:10:30 crc kubenswrapper[4789]: E1216 07:10:30.152857 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" Dec 16 07:10:30 crc kubenswrapper[4789]: E1216 07:10:30.165194 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13\\\"\"" pod="openstack/openstack-galera-0" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" Dec 16 07:10:39 crc kubenswrapper[4789]: I1216 07:10:39.643046 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.529611 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.530150 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mj59z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-f8w4t_openstack(b3c2ad40-ba60-4f27-af90-7e7455ff4f85): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.531499 4789 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" podUID="b3c2ad40-ba60-4f27-af90-7e7455ff4f85" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.539280 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.539502 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5576,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-c7cbb8f79-4lkfh_openstack(ec04a3ac-e0cd-491a-8af6-c2bbfaece281): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.540687 4789 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" podUID="ec04a3ac-e0cd-491a-8af6-c2bbfaece281" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.552218 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.552410 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvhkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-95f5f6995-kvmbq_openstack(ae0e1a2e-99bd-4973-8482-685c1b9d2fee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.555303 4789 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" podUID="ae0e1a2e-99bd-4973-8482-685c1b9d2fee" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.561242 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.562223 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6b2fd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-pq2b2_openstack(80eee5ab-8219-48e4-98ae-7be381adbc39): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:10:40 crc kubenswrapper[4789]: E1216 07:10:40.563896 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" podUID="80eee5ab-8219-48e4-98ae-7be381adbc39" Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.072044 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.134143 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9"] Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.154832 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.247557 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tblns"] Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.249112 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6456012f-c7be-458c-a9a5-b3958ae72c2c","Type":"ContainerStarted","Data":"5187726476a931cbb4c315efec1f0ad6089e99939e1b36488eb5216377147f30"} Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.250422 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"db3b1c91-5558-4afb-a9fc-dd75527451ee","Type":"ContainerStarted","Data":"530a5d65e6717bf5cb2f7088a0efb5a0b81a55b156e61bc16cea37ce6d84bbe3"} Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.250552 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.251838 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f699c71b-1e44-4a4d-b1fb-77ef105af03d","Type":"ContainerStarted","Data":"71816cb1e1dc225cc4c62c55a8064e75aa79bd2df0d364078f17dc7c513e994f"} Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.252856 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"02b0fdb4-d395-4464-8250-4288ca50c8de","Type":"ContainerStarted","Data":"f63dd86fd6f300df198f61c2064f8ec952090ae09cee0dcd1fdf84aa6369e100"} Dec 16 07:10:41 crc kubenswrapper[4789]: E1216 07:10:41.254173 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" podUID="ae0e1a2e-99bd-4973-8482-685c1b9d2fee" Dec 16 07:10:41 crc kubenswrapper[4789]: E1216 07:10:41.256572 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" podUID="ec04a3ac-e0cd-491a-8af6-c2bbfaece281" Dec 16 07:10:41 crc kubenswrapper[4789]: W1216 07:10:41.271825 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e16a3ef_920e_493a_ae2f_7336d64bbd7e.slice/crio-5dbcf4faa066d42f7701a1f9fed6538be3aac4a93eaef6d495518a71ebff75bb WatchSource:0}: Error finding container 5dbcf4faa066d42f7701a1f9fed6538be3aac4a93eaef6d495518a71ebff75bb: Status 404 returned error can't find the container with id 5dbcf4faa066d42f7701a1f9fed6538be3aac4a93eaef6d495518a71ebff75bb Dec 16 07:10:41 crc kubenswrapper[4789]: I1216 07:10:41.275029 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=5.586876242 podStartE2EDuration="25.275008572s" podCreationTimestamp="2025-12-16 07:10:16 +0000 UTC" firstStartedPulling="2025-12-16 07:10:20.877673854 +0000 UTC m=+1159.139561483" lastFinishedPulling="2025-12-16 
07:10:40.565806174 +0000 UTC m=+1178.827693813" observedRunningTime="2025-12-16 07:10:41.266956027 +0000 UTC m=+1179.528843666" watchObservedRunningTime="2025-12-16 07:10:41.275008572 +0000 UTC m=+1179.536896211" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.027596 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.036645 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.204741 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-config\") pod \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.204831 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-dns-svc\") pod \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.205066 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj59z\" (UniqueName: \"kubernetes.io/projected/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-kube-api-access-mj59z\") pod \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\" (UID: \"b3c2ad40-ba60-4f27-af90-7e7455ff4f85\") " Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.205497 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3c2ad40-ba60-4f27-af90-7e7455ff4f85" (UID: "b3c2ad40-ba60-4f27-af90-7e7455ff4f85"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.205946 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80eee5ab-8219-48e4-98ae-7be381adbc39-config\") pod \"80eee5ab-8219-48e4-98ae-7be381adbc39\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.205980 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b2fd\" (UniqueName: \"kubernetes.io/projected/80eee5ab-8219-48e4-98ae-7be381adbc39-kube-api-access-6b2fd\") pod \"80eee5ab-8219-48e4-98ae-7be381adbc39\" (UID: \"80eee5ab-8219-48e4-98ae-7be381adbc39\") " Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.206390 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.206699 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-config" (OuterVolumeSpecName: "config") pod "b3c2ad40-ba60-4f27-af90-7e7455ff4f85" (UID: "b3c2ad40-ba60-4f27-af90-7e7455ff4f85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.207526 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80eee5ab-8219-48e4-98ae-7be381adbc39-config" (OuterVolumeSpecName: "config") pod "80eee5ab-8219-48e4-98ae-7be381adbc39" (UID: "80eee5ab-8219-48e4-98ae-7be381adbc39"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.212282 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-kube-api-access-mj59z" (OuterVolumeSpecName: "kube-api-access-mj59z") pod "b3c2ad40-ba60-4f27-af90-7e7455ff4f85" (UID: "b3c2ad40-ba60-4f27-af90-7e7455ff4f85"). InnerVolumeSpecName "kube-api-access-mj59z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.214369 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80eee5ab-8219-48e4-98ae-7be381adbc39-kube-api-access-6b2fd" (OuterVolumeSpecName: "kube-api-access-6b2fd") pod "80eee5ab-8219-48e4-98ae-7be381adbc39" (UID: "80eee5ab-8219-48e4-98ae-7be381adbc39"). InnerVolumeSpecName "kube-api-access-6b2fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.263643 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2fceb99a-9dfd-4d79-a0fd-666390de4440","Type":"ContainerStarted","Data":"05f5fc2d0ad0e205349987b78f5f54ae34737edef7c39e28b3608e88243874d9"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.267943 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31336d9f-38cf-4805-927b-3ae986f6c88e","Type":"ContainerStarted","Data":"ff767a7cabcaa4f9752eac58d5657fbc09c94d5629fc004f9ab8e06f780b0a62"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.271569 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d868c627-a661-4c69-afd7-26d88b2be0ec","Type":"ContainerStarted","Data":"d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.274432 4789 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-ovs-tblns" event={"ID":"b5429404-d973-4580-961a-8ad6081e93ec","Type":"ContainerStarted","Data":"1e9ca768581b07a47cbe4eb52da147b3dc82157f500ff347c14e7e5647f13dd3"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.275204 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9" event={"ID":"1e16a3ef-920e-493a-ae2f-7336d64bbd7e","Type":"ContainerStarted","Data":"5dbcf4faa066d42f7701a1f9fed6538be3aac4a93eaef6d495518a71ebff75bb"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.276710 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9452e1b2-42ec-47b6-96e1-2770c9e76db2","Type":"ContainerStarted","Data":"e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.278522 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" event={"ID":"b3c2ad40-ba60-4f27-af90-7e7455ff4f85","Type":"ContainerDied","Data":"fdf90b3fc166cfa22823f6e392d56ba177d5735a9af8d85ab0bbe5613c59d9fe"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.278568 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-f8w4t" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.289630 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.289634 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-pq2b2" event={"ID":"80eee5ab-8219-48e4-98ae-7be381adbc39","Type":"ContainerDied","Data":"6ee590e7e19f480d576bc052f37653135fefb57173943839012788563bd31599"} Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.324580 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj59z\" (UniqueName: \"kubernetes.io/projected/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-kube-api-access-mj59z\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.324646 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80eee5ab-8219-48e4-98ae-7be381adbc39-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.324679 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b2fd\" (UniqueName: \"kubernetes.io/projected/80eee5ab-8219-48e4-98ae-7be381adbc39-kube-api-access-6b2fd\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.324892 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3c2ad40-ba60-4f27-af90-7e7455ff4f85-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.403245 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-f8w4t"] Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.446468 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-f8w4t"] Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 07:10:42.466040 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-pq2b2"] Dec 16 07:10:42 crc kubenswrapper[4789]: I1216 
07:10:42.471284 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-pq2b2"] Dec 16 07:10:44 crc kubenswrapper[4789]: I1216 07:10:44.114130 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80eee5ab-8219-48e4-98ae-7be381adbc39" path="/var/lib/kubelet/pods/80eee5ab-8219-48e4-98ae-7be381adbc39/volumes" Dec 16 07:10:44 crc kubenswrapper[4789]: I1216 07:10:44.115736 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c2ad40-ba60-4f27-af90-7e7455ff4f85" path="/var/lib/kubelet/pods/b3c2ad40-ba60-4f27-af90-7e7455ff4f85/volumes" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.182723 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ghcvz"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.184018 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.186447 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.198274 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ghcvz"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.268451 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23503f0-7f00-4d2d-830b-fed7db6e6a08-config\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.268492 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovs-rundir\") pod \"ovn-controller-metrics-ghcvz\" 
(UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.268512 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.268576 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovn-rundir\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.268600 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6cbk\" (UniqueName: \"kubernetes.io/projected/e23503f0-7f00-4d2d-830b-fed7db6e6a08-kube-api-access-r6cbk\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.268616 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-combined-ca-bundle\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.337448 4789 generic.go:334] "Generic (PLEG): container finished" podID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerID="71816cb1e1dc225cc4c62c55a8064e75aa79bd2df0d364078f17dc7c513e994f" 
exitCode=0 Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.337538 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f699c71b-1e44-4a4d-b1fb-77ef105af03d","Type":"ContainerDied","Data":"71816cb1e1dc225cc4c62c55a8064e75aa79bd2df0d364078f17dc7c513e994f"} Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.375115 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovn-rundir\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.375456 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6cbk\" (UniqueName: \"kubernetes.io/projected/e23503f0-7f00-4d2d-830b-fed7db6e6a08-kube-api-access-r6cbk\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.375478 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-combined-ca-bundle\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.375558 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23503f0-7f00-4d2d-830b-fed7db6e6a08-config\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.375578 4789 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovs-rundir\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.375597 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.376500 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovn-rundir\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.376556 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovs-rundir\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.378553 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"02b0fdb4-d395-4464-8250-4288ca50c8de","Type":"ContainerStarted","Data":"c2cd1af7c1523ef803d9a8d625c1a5e7c662281e5cab347517b54666ab0a97e0"} Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.380057 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.385542 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-combined-ca-bundle\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.385956 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.387190 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-kvmbq"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.412300 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23503f0-7f00-4d2d-830b-fed7db6e6a08-config\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.431473 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6cbk\" (UniqueName: \"kubernetes.io/projected/e23503f0-7f00-4d2d-830b-fed7db6e6a08-kube-api-access-r6cbk\") pod \"ovn-controller-metrics-ghcvz\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") " pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.460020 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7878659675-rz2rk"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.461255 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.463644 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7878659675-rz2rk"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.463907 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.870598127 podStartE2EDuration="27.463894792s" podCreationTimestamp="2025-12-16 07:10:18 +0000 UTC" firstStartedPulling="2025-12-16 07:10:40.525996383 +0000 UTC m=+1178.787884012" lastFinishedPulling="2025-12-16 07:10:43.119293048 +0000 UTC m=+1181.381180677" observedRunningTime="2025-12-16 07:10:45.450793623 +0000 UTC m=+1183.712681252" watchObservedRunningTime="2025-12-16 07:10:45.463894792 +0000 UTC m=+1183.725782431" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.465090 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.502332 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ghcvz" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.582323 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-dns-svc\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.582400 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9hdc\" (UniqueName: \"kubernetes.io/projected/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-kube-api-access-g9hdc\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.582423 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.582446 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-config\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.683688 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-dns-svc\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: 
\"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.683944 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9hdc\" (UniqueName: \"kubernetes.io/projected/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-kube-api-access-g9hdc\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.683967 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.683994 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-config\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.684818 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-config\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.686071 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-dns-svc\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 
07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.687634 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-4lkfh"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.687806 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.724897 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9hdc\" (UniqueName: \"kubernetes.io/projected/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-kube-api-access-g9hdc\") pod \"dnsmasq-dns-7878659675-rz2rk\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.736306 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-ldr8c"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.737681 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.740449 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.773617 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-ldr8c"] Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.789736 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-config\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.789772 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hskt5\" (UniqueName: \"kubernetes.io/projected/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-kube-api-access-hskt5\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.789795 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.789825 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " 
pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.789848 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-dns-svc\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.800063 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.890842 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.891203 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.891229 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-dns-svc\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.891329 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-config\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.891345 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hskt5\" (UniqueName: \"kubernetes.io/projected/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-kube-api-access-hskt5\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.891946 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.892118 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.892670 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-dns-svc\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.892734 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-config\") pod 
\"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.912495 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hskt5\" (UniqueName: \"kubernetes.io/projected/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-kube-api-access-hskt5\") pod \"dnsmasq-dns-586b989cdc-ldr8c\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.915555 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.993072 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-config\") pod \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.993131 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvhkk\" (UniqueName: \"kubernetes.io/projected/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-kube-api-access-mvhkk\") pod \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.993156 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-dns-svc\") pod \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\" (UID: \"ae0e1a2e-99bd-4973-8482-685c1b9d2fee\") " Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.994019 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-dns-svc" (OuterVolumeSpecName: 
"dns-svc") pod "ae0e1a2e-99bd-4973-8482-685c1b9d2fee" (UID: "ae0e1a2e-99bd-4973-8482-685c1b9d2fee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.994413 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-config" (OuterVolumeSpecName: "config") pod "ae0e1a2e-99bd-4973-8482-685c1b9d2fee" (UID: "ae0e1a2e-99bd-4973-8482-685c1b9d2fee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:45 crc kubenswrapper[4789]: I1216 07:10:45.997739 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-kube-api-access-mvhkk" (OuterVolumeSpecName: "kube-api-access-mvhkk") pod "ae0e1a2e-99bd-4973-8482-685c1b9d2fee" (UID: "ae0e1a2e-99bd-4973-8482-685c1b9d2fee"). InnerVolumeSpecName "kube-api-access-mvhkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.095649 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.095990 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvhkk\" (UniqueName: \"kubernetes.io/projected/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-kube-api-access-mvhkk\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.096006 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae0e1a2e-99bd-4973-8482-685c1b9d2fee-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.099533 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.101624 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.196729 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-dns-svc\") pod \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.196832 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5576\" (UniqueName: \"kubernetes.io/projected/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-kube-api-access-z5576\") pod \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.196886 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-config\") pod \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\" (UID: \"ec04a3ac-e0cd-491a-8af6-c2bbfaece281\") " Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.198551 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-config" (OuterVolumeSpecName: "config") pod "ec04a3ac-e0cd-491a-8af6-c2bbfaece281" (UID: "ec04a3ac-e0cd-491a-8af6-c2bbfaece281"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.199133 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec04a3ac-e0cd-491a-8af6-c2bbfaece281" (UID: "ec04a3ac-e0cd-491a-8af6-c2bbfaece281"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.201862 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-kube-api-access-z5576" (OuterVolumeSpecName: "kube-api-access-z5576") pod "ec04a3ac-e0cd-491a-8af6-c2bbfaece281" (UID: "ec04a3ac-e0cd-491a-8af6-c2bbfaece281"). InnerVolumeSpecName "kube-api-access-z5576". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.222855 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ghcvz"] Dec 16 07:10:46 crc kubenswrapper[4789]: W1216 07:10:46.247604 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode23503f0_7f00_4d2d_830b_fed7db6e6a08.slice/crio-1932411cb589fdeeffd4e760dc58986c5e9c987b080cd34d32d9f259a3620e50 WatchSource:0}: Error finding container 1932411cb589fdeeffd4e760dc58986c5e9c987b080cd34d32d9f259a3620e50: Status 404 returned error can't find the container with id 1932411cb589fdeeffd4e760dc58986c5e9c987b080cd34d32d9f259a3620e50 Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.299358 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.299416 4789 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-z5576\" (UniqueName: \"kubernetes.io/projected/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-kube-api-access-z5576\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.299432 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec04a3ac-e0cd-491a-8af6-c2bbfaece281-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.334264 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7878659675-rz2rk"] Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.390543 4789 generic.go:334] "Generic (PLEG): container finished" podID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerID="d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5" exitCode=0 Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.390622 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d868c627-a661-4c69-afd7-26d88b2be0ec","Type":"ContainerDied","Data":"d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.392290 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9" event={"ID":"1e16a3ef-920e-493a-ae2f-7336d64bbd7e","Type":"ContainerStarted","Data":"523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.392528 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-cw7z9" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.394935 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" event={"ID":"ae0e1a2e-99bd-4973-8482-685c1b9d2fee","Type":"ContainerDied","Data":"b725897d3c919baaa39ff1937d91a1f935c96db914f4e8dbf442742490b949fc"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 
07:10:46.395008 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-kvmbq" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.397874 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f699c71b-1e44-4a4d-b1fb-77ef105af03d","Type":"ContainerStarted","Data":"eddf93e3cc7bc715cd0565a58d08d71f72cabb8839a08286cf1924335b971dde"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.400121 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghcvz" event={"ID":"e23503f0-7f00-4d2d-830b-fed7db6e6a08","Type":"ContainerStarted","Data":"1932411cb589fdeeffd4e760dc58986c5e9c987b080cd34d32d9f259a3620e50"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.401173 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-rz2rk" event={"ID":"b50fd3ab-7d0c-438a-8de8-1001ab7209e3","Type":"ContainerStarted","Data":"35202d05e287dfe4d2731439b930690489e5a44116691f5485e0a8043a1b5a1c"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.402379 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" event={"ID":"ec04a3ac-e0cd-491a-8af6-c2bbfaece281","Type":"ContainerDied","Data":"f22a90872cb03df967e184a76a05de17f370ef5f2d425e1e334df3b15058b97c"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.402443 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c7cbb8f79-4lkfh" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.404444 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2fceb99a-9dfd-4d79-a0fd-666390de4440","Type":"ContainerStarted","Data":"c12dcb355856793bb3ead4efde2d2c892f26c8b50ad22dd0a5a3468ce4f9c0a6"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.411148 4789 generic.go:334] "Generic (PLEG): container finished" podID="b5429404-d973-4580-961a-8ad6081e93ec" containerID="c46ade5ea8d28fc7bb2d2be5d8b1d0eb0b2a1ed94708b13e6fcd4a9538e9d515" exitCode=0 Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.411240 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tblns" event={"ID":"b5429404-d973-4580-961a-8ad6081e93ec","Type":"ContainerDied","Data":"c46ade5ea8d28fc7bb2d2be5d8b1d0eb0b2a1ed94708b13e6fcd4a9538e9d515"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.414070 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6456012f-c7be-458c-a9a5-b3958ae72c2c","Type":"ContainerStarted","Data":"bb2b0b90bf159eee0923971b8cd93d6fe6ac8b4ae7096de2af216bf8667a77a6"} Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.433183 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.233351229 podStartE2EDuration="31.433164828s" podCreationTimestamp="2025-12-16 07:10:15 +0000 UTC" firstStartedPulling="2025-12-16 07:10:17.353713735 +0000 UTC m=+1155.615601364" lastFinishedPulling="2025-12-16 07:10:40.553527334 +0000 UTC m=+1178.815414963" observedRunningTime="2025-12-16 07:10:46.431395375 +0000 UTC m=+1184.693283004" watchObservedRunningTime="2025-12-16 07:10:46.433164828 +0000 UTC m=+1184.695052457" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.468875 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-95f5f6995-kvmbq"] Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.479857 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-kvmbq"] Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.492767 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-cw7z9" podStartSLOduration=21.870908076 podStartE2EDuration="25.49274846s" podCreationTimestamp="2025-12-16 07:10:21 +0000 UTC" firstStartedPulling="2025-12-16 07:10:41.291819552 +0000 UTC m=+1179.553707181" lastFinishedPulling="2025-12-16 07:10:44.913659936 +0000 UTC m=+1183.175547565" observedRunningTime="2025-12-16 07:10:46.490792432 +0000 UTC m=+1184.752680051" watchObservedRunningTime="2025-12-16 07:10:46.49274846 +0000 UTC m=+1184.754636089" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.581884 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-ldr8c"] Dec 16 07:10:46 crc kubenswrapper[4789]: W1216 07:10:46.581990 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode60fe9bd_5ec9_4c8c_a374_a506154afdbe.slice/crio-ad4601b7643d1e746f8e6eaf82b538be07f420d140c40119cbac563e43b1ea62 WatchSource:0}: Error finding container ad4601b7643d1e746f8e6eaf82b538be07f420d140c40119cbac563e43b1ea62: Status 404 returned error can't find the container with id ad4601b7643d1e746f8e6eaf82b538be07f420d140c40119cbac563e43b1ea62 Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.593974 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-4lkfh"] Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.602657 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c7cbb8f79-4lkfh"] Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.745629 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/openstack-cell1-galera-0" Dec 16 07:10:46 crc kubenswrapper[4789]: I1216 07:10:46.745967 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:47.237397 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:47.424338 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" event={"ID":"e60fe9bd-5ec9-4c8c-a374-a506154afdbe","Type":"ContainerStarted","Data":"ad4601b7643d1e746f8e6eaf82b538be07f420d140c40119cbac563e43b1ea62"} Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.113842 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae0e1a2e-99bd-4973-8482-685c1b9d2fee" path="/var/lib/kubelet/pods/ae0e1a2e-99bd-4973-8482-685c1b9d2fee/volumes" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.114484 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec04a3ac-e0cd-491a-8af6-c2bbfaece281" path="/var/lib/kubelet/pods/ec04a3ac-e0cd-491a-8af6-c2bbfaece281/volumes" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.435073 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d868c627-a661-4c69-afd7-26d88b2be0ec","Type":"ContainerStarted","Data":"08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d"} Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.438142 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tblns" event={"ID":"b5429404-d973-4580-961a-8ad6081e93ec","Type":"ContainerStarted","Data":"4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee"} Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.438185 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tblns" 
event={"ID":"b5429404-d973-4580-961a-8ad6081e93ec","Type":"ContainerStarted","Data":"b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652"} Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.438671 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.464582 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223372002.390211 podStartE2EDuration="34.464564281s" podCreationTimestamp="2025-12-16 07:10:14 +0000 UTC" firstStartedPulling="2025-12-16 07:10:16.098112142 +0000 UTC m=+1154.359999771" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:10:48.457168112 +0000 UTC m=+1186.719055751" watchObservedRunningTime="2025-12-16 07:10:48.464564281 +0000 UTC m=+1186.726451910" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.481055 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tblns" podStartSLOduration=23.875977828 podStartE2EDuration="27.481031793s" podCreationTimestamp="2025-12-16 07:10:21 +0000 UTC" firstStartedPulling="2025-12-16 07:10:41.291416632 +0000 UTC m=+1179.553304261" lastFinishedPulling="2025-12-16 07:10:44.896470597 +0000 UTC m=+1183.158358226" observedRunningTime="2025-12-16 07:10:48.478105992 +0000 UTC m=+1186.739993641" watchObservedRunningTime="2025-12-16 07:10:48.481031793 +0000 UTC m=+1186.742919422" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.909855 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-ldr8c"] Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.955497 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-v84p9"] Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.964261 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:48 crc kubenswrapper[4789]: I1216 07:10:48.990559 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-v84p9"] Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.152696 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.152993 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-config\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.153054 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-645qz\" (UniqueName: \"kubernetes.io/projected/7766e284-61b1-4146-b6a7-e45e8eb1772d-kube-api-access-645qz\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.153149 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.153192 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.254950 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-config\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.255043 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-645qz\" (UniqueName: \"kubernetes.io/projected/7766e284-61b1-4146-b6a7-e45e8eb1772d-kube-api-access-645qz\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.255111 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.255136 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.255184 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.256047 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-config\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.256442 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.256508 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.256650 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.276933 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-645qz\" (UniqueName: \"kubernetes.io/projected/7766e284-61b1-4146-b6a7-e45e8eb1772d-kube-api-access-645qz\") pod \"dnsmasq-dns-67fdf7998c-v84p9\" 
(UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.294547 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:49 crc kubenswrapper[4789]: I1216 07:10:49.444122 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.057561 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.064322 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.066612 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.069144 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.069358 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.069518 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-whtm2" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.087564 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.170430 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-cache\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 
07:10:50.170476 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.170657 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4pkq\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-kube-api-access-h4pkq\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.170744 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.170776 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-lock\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.272474 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4pkq\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-kube-api-access-h4pkq\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.273414 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.273756 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-lock\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: E1216 07:10:50.273594 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:10:50 crc kubenswrapper[4789]: E1216 07:10:50.274943 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:10:50 crc kubenswrapper[4789]: E1216 07:10:50.275009 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift podName:cbd6bd33-5f98-4eb6-9fee-5080941ee4c0 nodeName:}" failed. No retries permitted until 2025-12-16 07:10:50.774980001 +0000 UTC m=+1189.036867630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift") pod "swift-storage-0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0") : configmap "swift-ring-files" not found Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.274502 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-cache\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.275222 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.275454 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-cache\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.274898 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-lock\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.276643 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Dec 16 
07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.297272 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4pkq\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-kube-api-access-h4pkq\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.323101 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: E1216 07:10:50.442435 4789 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.46:47064->38.102.83.46:33871: write tcp 38.102.83.46:47064->38.102.83.46:33871: write: broken pipe Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.594277 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-v84p9"] Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.634651 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-zk74x"] Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.635658 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.638285 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.638907 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.639250 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.660057 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zk74x"] Dec 16 07:10:50 crc kubenswrapper[4789]: W1216 07:10:50.680841 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7766e284_61b1_4146_b6a7_e45e8eb1772d.slice/crio-d3fddf36060a0024ff9a939e135d11916914106e132b30c78776fc77c6c474a2 WatchSource:0}: Error finding container d3fddf36060a0024ff9a939e135d11916914106e132b30c78776fc77c6c474a2: Status 404 returned error can't find the container with id d3fddf36060a0024ff9a939e135d11916914106e132b30c78776fc77c6c474a2 Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.787762 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-dispersionconf\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.787807 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0a1b402-4791-402a-aa3e-b7f400007ac2-etc-swift\") pod \"swift-ring-rebalance-zk74x\" (UID: 
\"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.787955 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-ring-data-devices\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.788007 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.788028 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-scripts\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: E1216 07:10:50.788218 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:10:50 crc kubenswrapper[4789]: E1216 07:10:50.788232 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:10:50 crc kubenswrapper[4789]: E1216 07:10:50.788269 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift podName:cbd6bd33-5f98-4eb6-9fee-5080941ee4c0 nodeName:}" failed. 
No retries permitted until 2025-12-16 07:10:51.788255007 +0000 UTC m=+1190.050142726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift") pod "swift-storage-0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0") : configmap "swift-ring-files" not found Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.788597 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-combined-ca-bundle\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.788742 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhmh\" (UniqueName: \"kubernetes.io/projected/a0a1b402-4791-402a-aa3e-b7f400007ac2-kube-api-access-hwhmh\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.788766 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-swiftconf\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.889947 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0a1b402-4791-402a-aa3e-b7f400007ac2-etc-swift\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: 
I1216 07:10:50.890026 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-ring-data-devices\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.890056 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-scripts\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.890077 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-combined-ca-bundle\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.890142 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwhmh\" (UniqueName: \"kubernetes.io/projected/a0a1b402-4791-402a-aa3e-b7f400007ac2-kube-api-access-hwhmh\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.890159 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-swiftconf\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.890202 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-dispersionconf\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.890352 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0a1b402-4791-402a-aa3e-b7f400007ac2-etc-swift\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.891049 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-scripts\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.891314 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-ring-data-devices\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.893590 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-dispersionconf\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.893844 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-swiftconf\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.893900 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-combined-ca-bundle\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.908099 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwhmh\" (UniqueName: \"kubernetes.io/projected/a0a1b402-4791-402a-aa3e-b7f400007ac2-kube-api-access-hwhmh\") pod \"swift-ring-rebalance-zk74x\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:50 crc kubenswrapper[4789]: I1216 07:10:50.955591 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.404065 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zk74x"] Dec 16 07:10:51 crc kubenswrapper[4789]: W1216 07:10:51.412886 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a1b402_4791_402a_aa3e_b7f400007ac2.slice/crio-23d7efc16b36291185c1d2ab0e1197a02d4dd26b8e2d758a415937de4620905d WatchSource:0}: Error finding container 23d7efc16b36291185c1d2ab0e1197a02d4dd26b8e2d758a415937de4620905d: Status 404 returned error can't find the container with id 23d7efc16b36291185c1d2ab0e1197a02d4dd26b8e2d758a415937de4620905d Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.503929 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghcvz" event={"ID":"e23503f0-7f00-4d2d-830b-fed7db6e6a08","Type":"ContainerStarted","Data":"a7f4f9abcbcd0342850b2b57ff633a5bfa01b4b748c0160240e6025712e3081c"} Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.508360 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2fceb99a-9dfd-4d79-a0fd-666390de4440","Type":"ContainerStarted","Data":"36135c2133321ad8536765d85522a5abfcb2970720fa91440550ac33b7490f25"} Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.512023 4789 generic.go:334] "Generic (PLEG): container finished" podID="e60fe9bd-5ec9-4c8c-a374-a506154afdbe" containerID="f4f4452bbd42926a83747f10fd9baeb7b9b17558aa39d13ce3762178e610cd0e" exitCode=0 Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.512065 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" event={"ID":"e60fe9bd-5ec9-4c8c-a374-a506154afdbe","Type":"ContainerDied","Data":"f4f4452bbd42926a83747f10fd9baeb7b9b17558aa39d13ce3762178e610cd0e"} Dec 16 07:10:51 crc kubenswrapper[4789]: 
I1216 07:10:51.516319 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zk74x" event={"ID":"a0a1b402-4791-402a-aa3e-b7f400007ac2","Type":"ContainerStarted","Data":"23d7efc16b36291185c1d2ab0e1197a02d4dd26b8e2d758a415937de4620905d"} Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.526224 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6456012f-c7be-458c-a9a5-b3958ae72c2c","Type":"ContainerStarted","Data":"89655488a9eed182dbfe9d3dcd61937c370e55101b1d14bd72ae2d0e119b37e0"} Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.544946 4789 generic.go:334] "Generic (PLEG): container finished" podID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerID="bcd8c8c8b5782cf9fcec406ad1d2a5af94990dd945cb3cc1e0ebd4319d213c4f" exitCode=0 Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.547762 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-rz2rk" event={"ID":"b50fd3ab-7d0c-438a-8de8-1001ab7209e3","Type":"ContainerDied","Data":"bcd8c8c8b5782cf9fcec406ad1d2a5af94990dd945cb3cc1e0ebd4319d213c4f"} Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.560334 4789 generic.go:334] "Generic (PLEG): container finished" podID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerID="dd6706f3e45c930b5e7382399b5ad642b49d3db8f65af9441ec6627714a15acb" exitCode=0 Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.560435 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" event={"ID":"7766e284-61b1-4146-b6a7-e45e8eb1772d","Type":"ContainerDied","Data":"dd6706f3e45c930b5e7382399b5ad642b49d3db8f65af9441ec6627714a15acb"} Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.560473 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" 
event={"ID":"7766e284-61b1-4146-b6a7-e45e8eb1772d","Type":"ContainerStarted","Data":"d3fddf36060a0024ff9a939e135d11916914106e132b30c78776fc77c6c474a2"} Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.565439 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ghcvz" podStartSLOduration=2.62060552 podStartE2EDuration="6.565417272s" podCreationTimestamp="2025-12-16 07:10:45 +0000 UTC" firstStartedPulling="2025-12-16 07:10:46.250135279 +0000 UTC m=+1184.512022908" lastFinishedPulling="2025-12-16 07:10:50.194947031 +0000 UTC m=+1188.456834660" observedRunningTime="2025-12-16 07:10:51.525601782 +0000 UTC m=+1189.787489411" watchObservedRunningTime="2025-12-16 07:10:51.565417272 +0000 UTC m=+1189.827304901" Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.569475 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.638461603 podStartE2EDuration="27.56945365s" podCreationTimestamp="2025-12-16 07:10:24 +0000 UTC" firstStartedPulling="2025-12-16 07:10:41.293566955 +0000 UTC m=+1179.555454584" lastFinishedPulling="2025-12-16 07:10:50.224559002 +0000 UTC m=+1188.486446631" observedRunningTime="2025-12-16 07:10:51.554228499 +0000 UTC m=+1189.816116128" watchObservedRunningTime="2025-12-16 07:10:51.56945365 +0000 UTC m=+1189.831341279" Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.594052 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.535560024 podStartE2EDuration="27.594035399s" podCreationTimestamp="2025-12-16 07:10:24 +0000 UTC" firstStartedPulling="2025-12-16 07:10:41.171208553 +0000 UTC m=+1179.433096182" lastFinishedPulling="2025-12-16 07:10:50.229683928 +0000 UTC m=+1188.491571557" observedRunningTime="2025-12-16 07:10:51.59243975 +0000 UTC m=+1189.854327409" watchObservedRunningTime="2025-12-16 07:10:51.594035399 +0000 UTC 
m=+1189.855923028" Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.816420 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:51 crc kubenswrapper[4789]: E1216 07:10:51.816597 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:10:51 crc kubenswrapper[4789]: E1216 07:10:51.816613 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:10:51 crc kubenswrapper[4789]: E1216 07:10:51.816659 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift podName:cbd6bd33-5f98-4eb6-9fee-5080941ee4c0 nodeName:}" failed. No retries permitted until 2025-12-16 07:10:53.816645142 +0000 UTC m=+1192.078532771 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift") pod "swift-storage-0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0") : configmap "swift-ring-files" not found Dec 16 07:10:51 crc kubenswrapper[4789]: E1216 07:10:51.818605 4789 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 16 07:10:51 crc kubenswrapper[4789]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b50fd3ab-7d0c-438a-8de8-1001ab7209e3/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 16 07:10:51 crc kubenswrapper[4789]: > podSandboxID="35202d05e287dfe4d2731439b930690489e5a44116691f5485e0a8043a1b5a1c" Dec 16 07:10:51 crc kubenswrapper[4789]: E1216 07:10:51.818779 4789 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 16 07:10:51 crc kubenswrapper[4789]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bfh5d7h8hd8h664h564hfbh5d4h5f5h55h5fch66h675hb8h65bh64dhbh5dchc9h66fh5dbhf4h658h64ch55bhbh65h55dh597h68dh579hbdq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g9hdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7878659675-rz2rk_openstack(b50fd3ab-7d0c-438a-8de8-1001ab7209e3): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b50fd3ab-7d0c-438a-8de8-1001ab7209e3/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 16 07:10:51 crc kubenswrapper[4789]: > logger="UnhandledError" Dec 16 07:10:51 crc kubenswrapper[4789]: E1216 07:10:51.819975 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b50fd3ab-7d0c-438a-8de8-1001ab7209e3/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7878659675-rz2rk" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" Dec 16 07:10:51 crc kubenswrapper[4789]: I1216 07:10:51.851465 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.019151 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-sb\") pod \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.019323 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-dns-svc\") pod \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.019400 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-config\") pod \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.020180 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hskt5\" (UniqueName: \"kubernetes.io/projected/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-kube-api-access-hskt5\") pod \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.020763 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-nb\") pod \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\" (UID: \"e60fe9bd-5ec9-4c8c-a374-a506154afdbe\") " Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.024108 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-kube-api-access-hskt5" (OuterVolumeSpecName: "kube-api-access-hskt5") pod "e60fe9bd-5ec9-4c8c-a374-a506154afdbe" (UID: "e60fe9bd-5ec9-4c8c-a374-a506154afdbe"). InnerVolumeSpecName "kube-api-access-hskt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.039709 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e60fe9bd-5ec9-4c8c-a374-a506154afdbe" (UID: "e60fe9bd-5ec9-4c8c-a374-a506154afdbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.039761 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-config" (OuterVolumeSpecName: "config") pod "e60fe9bd-5ec9-4c8c-a374-a506154afdbe" (UID: "e60fe9bd-5ec9-4c8c-a374-a506154afdbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.046087 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e60fe9bd-5ec9-4c8c-a374-a506154afdbe" (UID: "e60fe9bd-5ec9-4c8c-a374-a506154afdbe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.047007 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e60fe9bd-5ec9-4c8c-a374-a506154afdbe" (UID: "e60fe9bd-5ec9-4c8c-a374-a506154afdbe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.124996 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.125059 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.125069 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.125079 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hskt5\" (UniqueName: \"kubernetes.io/projected/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-kube-api-access-hskt5\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.125108 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e60fe9bd-5ec9-4c8c-a374-a506154afdbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:52 crc kubenswrapper[4789]: E1216 07:10:52.199184 4789 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.46:47078->38.102.83.46:33871: write tcp 38.102.83.46:47078->38.102.83.46:33871: write: broken pipe Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.568585 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" event={"ID":"7766e284-61b1-4146-b6a7-e45e8eb1772d","Type":"ContainerStarted","Data":"88365ed41bbf83111859483b5a0e5bb3068071855ecbc6908269bc8d5048fcce"} Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 
07:10:52.569948 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.572290 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" event={"ID":"e60fe9bd-5ec9-4c8c-a374-a506154afdbe","Type":"ContainerDied","Data":"ad4601b7643d1e746f8e6eaf82b538be07f420d140c40119cbac563e43b1ea62"} Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.572348 4789 scope.go:117] "RemoveContainer" containerID="f4f4452bbd42926a83747f10fd9baeb7b9b17558aa39d13ce3762178e610cd0e" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.572787 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-ldr8c" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.616641 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" podStartSLOduration=4.6166220639999995 podStartE2EDuration="4.616622064s" podCreationTimestamp="2025-12-16 07:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:10:52.597561019 +0000 UTC m=+1190.859448658" watchObservedRunningTime="2025-12-16 07:10:52.616622064 +0000 UTC m=+1190.878509693" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.664037 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-ldr8c"] Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.674057 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-ldr8c"] Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.839843 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.876980 4789 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.891719 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 16 07:10:52 crc kubenswrapper[4789]: I1216 07:10:52.964924 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.040771 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.080469 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.578292 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.578336 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.619888 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.625677 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.858583 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:53 crc kubenswrapper[4789]: E1216 07:10:53.860108 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:10:53 crc 
kubenswrapper[4789]: E1216 07:10:53.861892 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:10:53 crc kubenswrapper[4789]: E1216 07:10:53.861968 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift podName:cbd6bd33-5f98-4eb6-9fee-5080941ee4c0 nodeName:}" failed. No retries permitted until 2025-12-16 07:10:57.861946305 +0000 UTC m=+1196.123833934 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift") pod "swift-storage-0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0") : configmap "swift-ring-files" not found Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.885069 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:10:53 crc kubenswrapper[4789]: E1216 07:10:53.885454 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60fe9bd-5ec9-4c8c-a374-a506154afdbe" containerName="init" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.885475 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60fe9bd-5ec9-4c8c-a374-a506154afdbe" containerName="init" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.885714 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e60fe9bd-5ec9-4c8c-a374-a506154afdbe" containerName="init" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.886595 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.889715 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-874xg" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.890003 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.892491 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.892524 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.899960 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.963961 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.964101 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp9kb\" (UniqueName: \"kubernetes.io/projected/63f88379-6b15-47a6-bf24-7cf0b3edc56a-kube-api-access-dp9kb\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.964194 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-scripts\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " 
pod="openstack/ovn-northd-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.964651 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.964802 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-config\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.964866 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:53 crc kubenswrapper[4789]: I1216 07:10:53.965001 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.066433 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.066478 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-config\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.066504 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.066545 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.066567 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.066604 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp9kb\" (UniqueName: \"kubernetes.io/projected/63f88379-6b15-47a6-bf24-7cf0b3edc56a-kube-api-access-dp9kb\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.066659 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-scripts\") pod \"ovn-northd-0\" (UID: 
\"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.067539 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.067869 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-config\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.067937 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-scripts\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.073239 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.073477 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.074193 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.083188 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp9kb\" (UniqueName: \"kubernetes.io/projected/63f88379-6b15-47a6-bf24-7cf0b3edc56a-kube-api-access-dp9kb\") pod \"ovn-northd-0\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") " pod="openstack/ovn-northd-0" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.115946 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e60fe9bd-5ec9-4c8c-a374-a506154afdbe" path="/var/lib/kubelet/pods/e60fe9bd-5ec9-4c8c-a374-a506154afdbe/volumes" Dec 16 07:10:54 crc kubenswrapper[4789]: I1216 07:10:54.206798 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.078138 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:10:55 crc kubenswrapper[4789]: W1216 07:10:55.079586 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63f88379_6b15_47a6_bf24_7cf0b3edc56a.slice/crio-e341a1ba4af9cf60c5f85558bfe8ebe06fb1fae67a426bf30ca491dc31891589 WatchSource:0}: Error finding container e341a1ba4af9cf60c5f85558bfe8ebe06fb1fae67a426bf30ca491dc31891589: Status 404 returned error can't find the container with id e341a1ba4af9cf60c5f85558bfe8ebe06fb1fae67a426bf30ca491dc31891589 Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.431448 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.431944 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 16 
07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.518657 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.593181 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"63f88379-6b15-47a6-bf24-7cf0b3edc56a","Type":"ContainerStarted","Data":"e341a1ba4af9cf60c5f85558bfe8ebe06fb1fae67a426bf30ca491dc31891589"} Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.595592 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-rz2rk" event={"ID":"b50fd3ab-7d0c-438a-8de8-1001ab7209e3","Type":"ContainerStarted","Data":"fbf84d0254bdb3ae7e747d4a64397daa71dbc59791575221a584d6dab935ba96"} Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.595859 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.608077 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zk74x" event={"ID":"a0a1b402-4791-402a-aa3e-b7f400007ac2","Type":"ContainerStarted","Data":"6b9b43486b394346d4ab06609223e399c69302b8c06ad02c18fe02f7f5d8d2d8"} Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.628341 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7878659675-rz2rk" podStartSLOduration=6.756527169 podStartE2EDuration="10.628311932s" podCreationTimestamp="2025-12-16 07:10:45 +0000 UTC" firstStartedPulling="2025-12-16 07:10:46.353830655 +0000 UTC m=+1184.615718274" lastFinishedPulling="2025-12-16 07:10:50.225615408 +0000 UTC m=+1188.487503037" observedRunningTime="2025-12-16 07:10:55.622411378 +0000 UTC m=+1193.884299017" watchObservedRunningTime="2025-12-16 07:10:55.628311932 +0000 UTC m=+1193.890199561" Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.647782 4789 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/swift-ring-rebalance-zk74x" podStartSLOduration=2.342602508 podStartE2EDuration="5.647758776s" podCreationTimestamp="2025-12-16 07:10:50 +0000 UTC" firstStartedPulling="2025-12-16 07:10:51.416076883 +0000 UTC m=+1189.677964512" lastFinishedPulling="2025-12-16 07:10:54.721233151 +0000 UTC m=+1192.983120780" observedRunningTime="2025-12-16 07:10:55.644123037 +0000 UTC m=+1193.906010676" watchObservedRunningTime="2025-12-16 07:10:55.647758776 +0000 UTC m=+1193.909646395" Dec 16 07:10:55 crc kubenswrapper[4789]: I1216 07:10:55.687096 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.012741 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c869-account-create-update-dwt2f"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.015320 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.026963 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.028191 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c869-account-create-update-dwt2f"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.110257 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-sm8ns"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.111336 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.122176 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sm8ns"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.125551 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-operator-scripts\") pod \"keystone-c869-account-create-update-dwt2f\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.125646 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzmd4\" (UniqueName: \"kubernetes.io/projected/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-kube-api-access-wzmd4\") pod \"keystone-c869-account-create-update-dwt2f\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.197901 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-n2j8s"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.199159 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.208095 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n2j8s"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.226890 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzmd4\" (UniqueName: \"kubernetes.io/projected/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-kube-api-access-wzmd4\") pod \"keystone-c869-account-create-update-dwt2f\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.227006 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-operator-scripts\") pod \"keystone-db-create-sm8ns\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.227157 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-operator-scripts\") pod \"keystone-c869-account-create-update-dwt2f\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.227192 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thlk9\" (UniqueName: \"kubernetes.io/projected/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-kube-api-access-thlk9\") pod \"keystone-db-create-sm8ns\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.227797 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-operator-scripts\") pod \"keystone-c869-account-create-update-dwt2f\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.245694 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzmd4\" (UniqueName: \"kubernetes.io/projected/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-kube-api-access-wzmd4\") pod \"keystone-c869-account-create-update-dwt2f\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.317746 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-dcc5-account-create-update-lx5zl"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.318957 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.321039 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.327346 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dcc5-account-create-update-lx5zl"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.328175 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d1ec63-1e07-4430-84f6-6f356d6cb420-operator-scripts\") pod \"placement-db-create-n2j8s\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.328256 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-operator-scripts\") pod \"keystone-db-create-sm8ns\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.328316 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzgf\" (UniqueName: \"kubernetes.io/projected/16d1ec63-1e07-4430-84f6-6f356d6cb420-kube-api-access-grzgf\") pod \"placement-db-create-n2j8s\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.328386 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thlk9\" (UniqueName: \"kubernetes.io/projected/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-kube-api-access-thlk9\") pod \"keystone-db-create-sm8ns\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " 
pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.330825 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-operator-scripts\") pod \"keystone-db-create-sm8ns\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.359818 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thlk9\" (UniqueName: \"kubernetes.io/projected/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-kube-api-access-thlk9\") pod \"keystone-db-create-sm8ns\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.395281 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.397886 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-cq5lc"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.398807 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.409796 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cq5lc"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.429507 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzgf\" (UniqueName: \"kubernetes.io/projected/16d1ec63-1e07-4430-84f6-6f356d6cb420-kube-api-access-grzgf\") pod \"placement-db-create-n2j8s\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.429585 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9w89\" (UniqueName: \"kubernetes.io/projected/d4d78f46-553d-47a0-a433-445b66500e1c-kube-api-access-c9w89\") pod \"placement-dcc5-account-create-update-lx5zl\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.429642 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d1ec63-1e07-4430-84f6-6f356d6cb420-operator-scripts\") pod \"placement-db-create-n2j8s\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.429699 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d78f46-553d-47a0-a433-445b66500e1c-operator-scripts\") pod \"placement-dcc5-account-create-update-lx5zl\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.430634 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d1ec63-1e07-4430-84f6-6f356d6cb420-operator-scripts\") pod \"placement-db-create-n2j8s\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.437985 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sm8ns" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.444118 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzgf\" (UniqueName: \"kubernetes.io/projected/16d1ec63-1e07-4430-84f6-6f356d6cb420-kube-api-access-grzgf\") pod \"placement-db-create-n2j8s\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.501328 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9ba2-account-create-update-2225k"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.502368 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.505105 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.523317 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2j8s" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.542979 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9ba2-account-create-update-2225k"] Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.544231 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r4t7\" (UniqueName: \"kubernetes.io/projected/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-kube-api-access-2r4t7\") pod \"glance-db-create-cq5lc\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.544325 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d78f46-553d-47a0-a433-445b66500e1c-operator-scripts\") pod \"placement-dcc5-account-create-update-lx5zl\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.544402 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-operator-scripts\") pod \"glance-db-create-cq5lc\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.544490 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9w89\" (UniqueName: \"kubernetes.io/projected/d4d78f46-553d-47a0-a433-445b66500e1c-kube-api-access-c9w89\") pod \"placement-dcc5-account-create-update-lx5zl\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.545441 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d78f46-553d-47a0-a433-445b66500e1c-operator-scripts\") pod \"placement-dcc5-account-create-update-lx5zl\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.589578 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9w89\" (UniqueName: \"kubernetes.io/projected/d4d78f46-553d-47a0-a433-445b66500e1c-kube-api-access-c9w89\") pod \"placement-dcc5-account-create-update-lx5zl\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.636724 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.645691 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/980cd36a-7926-48a8-9749-559317eeee7f-operator-scripts\") pod \"glance-9ba2-account-create-update-2225k\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.645736 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dczw5\" (UniqueName: \"kubernetes.io/projected/980cd36a-7926-48a8-9749-559317eeee7f-kube-api-access-dczw5\") pod \"glance-9ba2-account-create-update-2225k\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.645801 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r4t7\" (UniqueName: 
\"kubernetes.io/projected/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-kube-api-access-2r4t7\") pod \"glance-db-create-cq5lc\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.645869 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-operator-scripts\") pod \"glance-db-create-cq5lc\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.646721 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-operator-scripts\") pod \"glance-db-create-cq5lc\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.682164 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r4t7\" (UniqueName: \"kubernetes.io/projected/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-kube-api-access-2r4t7\") pod \"glance-db-create-cq5lc\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.713415 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cq5lc" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.748340 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/980cd36a-7926-48a8-9749-559317eeee7f-operator-scripts\") pod \"glance-9ba2-account-create-update-2225k\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.748385 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dczw5\" (UniqueName: \"kubernetes.io/projected/980cd36a-7926-48a8-9749-559317eeee7f-kube-api-access-dczw5\") pod \"glance-9ba2-account-create-update-2225k\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.749213 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/980cd36a-7926-48a8-9749-559317eeee7f-operator-scripts\") pod \"glance-9ba2-account-create-update-2225k\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.763690 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dczw5\" (UniqueName: \"kubernetes.io/projected/980cd36a-7926-48a8-9749-559317eeee7f-kube-api-access-dczw5\") pod \"glance-9ba2-account-create-update-2225k\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.872386 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:10:57 crc kubenswrapper[4789]: I1216 07:10:57.953065 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:10:57 crc kubenswrapper[4789]: E1216 07:10:57.953544 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:10:57 crc kubenswrapper[4789]: E1216 07:10:57.953557 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:10:57 crc kubenswrapper[4789]: E1216 07:10:57.953599 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift podName:cbd6bd33-5f98-4eb6-9fee-5080941ee4c0 nodeName:}" failed. No retries permitted until 2025-12-16 07:11:05.953584235 +0000 UTC m=+1204.215471864 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift") pod "swift-storage-0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0") : configmap "swift-ring-files" not found Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.403193 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c869-account-create-update-dwt2f"] Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.410345 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dcc5-account-create-update-lx5zl"] Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.436155 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sm8ns"] Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.446704 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n2j8s"] Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.562220 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cq5lc"] Dec 16 07:10:58 crc kubenswrapper[4789]: W1216 07:10:58.564954 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef67dda_a2a4_4ad0_99e3_3b918fdaca0d.slice/crio-c766f2e7e9b137c02840f8bc114babce75be46451b44f2bb5b9e05d0f74bf459 WatchSource:0}: Error finding container c766f2e7e9b137c02840f8bc114babce75be46451b44f2bb5b9e05d0f74bf459: Status 404 returned error can't find the container with id c766f2e7e9b137c02840f8bc114babce75be46451b44f2bb5b9e05d0f74bf459 Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.628478 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9ba2-account-create-update-2225k"] Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.631829 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c869-account-create-update-dwt2f" 
event={"ID":"e20ac101-1bcb-4ca7-8a77-e827c5eb6383","Type":"ContainerStarted","Data":"a22c22690888ce152129c479868f590c078360f0f1973e2d66016c8cfc98eded"} Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.632874 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2j8s" event={"ID":"16d1ec63-1e07-4430-84f6-6f356d6cb420","Type":"ContainerStarted","Data":"1637ac2230fbe66a7ab91718178c98df73dc453879e491e567d12d62b06c7769"} Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.634342 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cq5lc" event={"ID":"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d","Type":"ContainerStarted","Data":"c766f2e7e9b137c02840f8bc114babce75be46451b44f2bb5b9e05d0f74bf459"} Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.636899 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"63f88379-6b15-47a6-bf24-7cf0b3edc56a","Type":"ContainerStarted","Data":"c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985"} Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.636946 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"63f88379-6b15-47a6-bf24-7cf0b3edc56a","Type":"ContainerStarted","Data":"81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431"} Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.637213 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.638305 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dcc5-account-create-update-lx5zl" event={"ID":"d4d78f46-553d-47a0-a433-445b66500e1c","Type":"ContainerStarted","Data":"7ea22bcee73b87eb6d3b70ae5db0c86fb86da51f3d2f5902c1f1c32049495ad0"} Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.639773 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-sm8ns" event={"ID":"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0","Type":"ContainerStarted","Data":"88f8d2c5b68c0a11cf6393996c89b58d529d5a3bafd1cfa475a8108537ec53f3"} Dec 16 07:10:58 crc kubenswrapper[4789]: W1216 07:10:58.644108 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod980cd36a_7926_48a8_9749_559317eeee7f.slice/crio-1460a2a0861accd472952afd99dadc76404586ae3335130010d03ef1574b60cc WatchSource:0}: Error finding container 1460a2a0861accd472952afd99dadc76404586ae3335130010d03ef1574b60cc: Status 404 returned error can't find the container with id 1460a2a0861accd472952afd99dadc76404586ae3335130010d03ef1574b60cc Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.663863 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.906435998 podStartE2EDuration="5.66384434s" podCreationTimestamp="2025-12-16 07:10:53 +0000 UTC" firstStartedPulling="2025-12-16 07:10:55.082731989 +0000 UTC m=+1193.344619628" lastFinishedPulling="2025-12-16 07:10:57.840140351 +0000 UTC m=+1196.102027970" observedRunningTime="2025-12-16 07:10:58.655282682 +0000 UTC m=+1196.917170321" watchObservedRunningTime="2025-12-16 07:10:58.66384434 +0000 UTC m=+1196.925731969" Dec 16 07:10:58 crc kubenswrapper[4789]: I1216 07:10:58.877392 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.297186 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.355735 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7878659675-rz2rk"] Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.355967 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-7878659675-rz2rk" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerName="dnsmasq-dns" containerID="cri-o://fbf84d0254bdb3ae7e747d4a64397daa71dbc59791575221a584d6dab935ba96" gracePeriod=10 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.357687 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.665259 4789 generic.go:334] "Generic (PLEG): container finished" podID="980cd36a-7926-48a8-9749-559317eeee7f" containerID="3f6b0abba557f232b48bdb5ec4645df3b1b38170d8d6f2440c380ed6a141dd6d" exitCode=0 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.665505 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9ba2-account-create-update-2225k" event={"ID":"980cd36a-7926-48a8-9749-559317eeee7f","Type":"ContainerDied","Data":"3f6b0abba557f232b48bdb5ec4645df3b1b38170d8d6f2440c380ed6a141dd6d"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.665536 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9ba2-account-create-update-2225k" event={"ID":"980cd36a-7926-48a8-9749-559317eeee7f","Type":"ContainerStarted","Data":"1460a2a0861accd472952afd99dadc76404586ae3335130010d03ef1574b60cc"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.666653 4789 generic.go:334] "Generic (PLEG): container finished" podID="9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d" containerID="f7818360dd745ad4d1a1a3f20422491e4c611bad12433d1f211fc60b950a17fb" exitCode=0 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.666693 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cq5lc" event={"ID":"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d","Type":"ContainerDied","Data":"f7818360dd745ad4d1a1a3f20422491e4c611bad12433d1f211fc60b950a17fb"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.668299 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="d4d78f46-553d-47a0-a433-445b66500e1c" containerID="ab861ee372290f4f8fec30aa7a49246b3ed73a01f2cae71985e21abf394dedd1" exitCode=0 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.668368 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dcc5-account-create-update-lx5zl" event={"ID":"d4d78f46-553d-47a0-a433-445b66500e1c","Type":"ContainerDied","Data":"ab861ee372290f4f8fec30aa7a49246b3ed73a01f2cae71985e21abf394dedd1"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.670107 4789 generic.go:334] "Generic (PLEG): container finished" podID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerID="fbf84d0254bdb3ae7e747d4a64397daa71dbc59791575221a584d6dab935ba96" exitCode=0 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.670147 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-rz2rk" event={"ID":"b50fd3ab-7d0c-438a-8de8-1001ab7209e3","Type":"ContainerDied","Data":"fbf84d0254bdb3ae7e747d4a64397daa71dbc59791575221a584d6dab935ba96"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.671432 4789 generic.go:334] "Generic (PLEG): container finished" podID="2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0" containerID="52cc706873c97fe5550bec5c9f9177edfc302bf236179eb223b3d953638f9aaf" exitCode=0 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.671462 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sm8ns" event={"ID":"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0","Type":"ContainerDied","Data":"52cc706873c97fe5550bec5c9f9177edfc302bf236179eb223b3d953638f9aaf"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.672784 4789 generic.go:334] "Generic (PLEG): container finished" podID="e20ac101-1bcb-4ca7-8a77-e827c5eb6383" containerID="2a59947be685694967d3e633218d3dd126cc65e9f818fd0449696960e2ac3ffd" exitCode=0 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.672820 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-c869-account-create-update-dwt2f" event={"ID":"e20ac101-1bcb-4ca7-8a77-e827c5eb6383","Type":"ContainerDied","Data":"2a59947be685694967d3e633218d3dd126cc65e9f818fd0449696960e2ac3ffd"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.673796 4789 generic.go:334] "Generic (PLEG): container finished" podID="16d1ec63-1e07-4430-84f6-6f356d6cb420" containerID="2968db79927abd4bde0e18043eeec75e605baf4886fb111fa203e77c24e8aa6d" exitCode=0 Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.674722 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2j8s" event={"ID":"16d1ec63-1e07-4430-84f6-6f356d6cb420","Type":"ContainerDied","Data":"2968db79927abd4bde0e18043eeec75e605baf4886fb111fa203e77c24e8aa6d"} Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.826398 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.891367 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-config\") pod \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.891489 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-ovsdbserver-nb\") pod \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.891557 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-dns-svc\") pod \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " Dec 
16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.891588 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9hdc\" (UniqueName: \"kubernetes.io/projected/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-kube-api-access-g9hdc\") pod \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\" (UID: \"b50fd3ab-7d0c-438a-8de8-1001ab7209e3\") " Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.897820 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-kube-api-access-g9hdc" (OuterVolumeSpecName: "kube-api-access-g9hdc") pod "b50fd3ab-7d0c-438a-8de8-1001ab7209e3" (UID: "b50fd3ab-7d0c-438a-8de8-1001ab7209e3"). InnerVolumeSpecName "kube-api-access-g9hdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.929233 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b50fd3ab-7d0c-438a-8de8-1001ab7209e3" (UID: "b50fd3ab-7d0c-438a-8de8-1001ab7209e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.929245 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b50fd3ab-7d0c-438a-8de8-1001ab7209e3" (UID: "b50fd3ab-7d0c-438a-8de8-1001ab7209e3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.929807 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-config" (OuterVolumeSpecName: "config") pod "b50fd3ab-7d0c-438a-8de8-1001ab7209e3" (UID: "b50fd3ab-7d0c-438a-8de8-1001ab7209e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.994071 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.994110 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.994121 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9hdc\" (UniqueName: \"kubernetes.io/projected/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-kube-api-access-g9hdc\") on node \"crc\" DevicePath \"\"" Dec 16 07:10:59 crc kubenswrapper[4789]: I1216 07:10:59.994132 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fd3ab-7d0c-438a-8de8-1001ab7209e3-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:00 crc kubenswrapper[4789]: I1216 07:11:00.682716 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7878659675-rz2rk" Dec 16 07:11:00 crc kubenswrapper[4789]: I1216 07:11:00.683490 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7878659675-rz2rk" event={"ID":"b50fd3ab-7d0c-438a-8de8-1001ab7209e3","Type":"ContainerDied","Data":"35202d05e287dfe4d2731439b930690489e5a44116691f5485e0a8043a1b5a1c"} Dec 16 07:11:00 crc kubenswrapper[4789]: I1216 07:11:00.683526 4789 scope.go:117] "RemoveContainer" containerID="fbf84d0254bdb3ae7e747d4a64397daa71dbc59791575221a584d6dab935ba96" Dec 16 07:11:00 crc kubenswrapper[4789]: I1216 07:11:00.711669 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7878659675-rz2rk"] Dec 16 07:11:00 crc kubenswrapper[4789]: I1216 07:11:00.712559 4789 scope.go:117] "RemoveContainer" containerID="bcd8c8c8b5782cf9fcec406ad1d2a5af94990dd945cb3cc1e0ebd4319d213c4f" Dec 16 07:11:00 crc kubenswrapper[4789]: I1216 07:11:00.724964 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7878659675-rz2rk"] Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.047551 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2j8s" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.120305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzgf\" (UniqueName: \"kubernetes.io/projected/16d1ec63-1e07-4430-84f6-6f356d6cb420-kube-api-access-grzgf\") pod \"16d1ec63-1e07-4430-84f6-6f356d6cb420\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.120428 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d1ec63-1e07-4430-84f6-6f356d6cb420-operator-scripts\") pod \"16d1ec63-1e07-4430-84f6-6f356d6cb420\" (UID: \"16d1ec63-1e07-4430-84f6-6f356d6cb420\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.121298 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d1ec63-1e07-4430-84f6-6f356d6cb420-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16d1ec63-1e07-4430-84f6-6f356d6cb420" (UID: "16d1ec63-1e07-4430-84f6-6f356d6cb420"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.125277 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d1ec63-1e07-4430-84f6-6f356d6cb420-kube-api-access-grzgf" (OuterVolumeSpecName: "kube-api-access-grzgf") pod "16d1ec63-1e07-4430-84f6-6f356d6cb420" (UID: "16d1ec63-1e07-4430-84f6-6f356d6cb420"). InnerVolumeSpecName "kube-api-access-grzgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.205584 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.215154 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sm8ns" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.226304 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grzgf\" (UniqueName: \"kubernetes.io/projected/16d1ec63-1e07-4430-84f6-6f356d6cb420-kube-api-access-grzgf\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.226342 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d1ec63-1e07-4430-84f6-6f356d6cb420-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.228516 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.243231 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.254149 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cq5lc" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.326900 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-operator-scripts\") pod \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.326993 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thlk9\" (UniqueName: \"kubernetes.io/projected/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-kube-api-access-thlk9\") pod \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327073 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-operator-scripts\") pod \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327103 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-operator-scripts\") pod \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\" (UID: \"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327169 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9w89\" (UniqueName: \"kubernetes.io/projected/d4d78f46-553d-47a0-a433-445b66500e1c-kube-api-access-c9w89\") pod \"d4d78f46-553d-47a0-a433-445b66500e1c\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327196 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d78f46-553d-47a0-a433-445b66500e1c-operator-scripts\") pod \"d4d78f46-553d-47a0-a433-445b66500e1c\" (UID: \"d4d78f46-553d-47a0-a433-445b66500e1c\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327248 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dczw5\" (UniqueName: \"kubernetes.io/projected/980cd36a-7926-48a8-9749-559317eeee7f-kube-api-access-dczw5\") pod \"980cd36a-7926-48a8-9749-559317eeee7f\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327275 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzmd4\" (UniqueName: \"kubernetes.io/projected/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-kube-api-access-wzmd4\") pod \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\" (UID: \"e20ac101-1bcb-4ca7-8a77-e827c5eb6383\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327335 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r4t7\" (UniqueName: \"kubernetes.io/projected/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-kube-api-access-2r4t7\") pod \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\" (UID: \"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327368 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/980cd36a-7926-48a8-9749-559317eeee7f-operator-scripts\") pod \"980cd36a-7926-48a8-9749-559317eeee7f\" (UID: \"980cd36a-7926-48a8-9749-559317eeee7f\") " Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327478 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"e20ac101-1bcb-4ca7-8a77-e827c5eb6383" (UID: "e20ac101-1bcb-4ca7-8a77-e827c5eb6383"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327825 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327862 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0" (UID: "2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.327929 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d" (UID: "9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.328400 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/980cd36a-7926-48a8-9749-559317eeee7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "980cd36a-7926-48a8-9749-559317eeee7f" (UID: "980cd36a-7926-48a8-9749-559317eeee7f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.328529 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d78f46-553d-47a0-a433-445b66500e1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4d78f46-553d-47a0-a433-445b66500e1c" (UID: "d4d78f46-553d-47a0-a433-445b66500e1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.330526 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d78f46-553d-47a0-a433-445b66500e1c-kube-api-access-c9w89" (OuterVolumeSpecName: "kube-api-access-c9w89") pod "d4d78f46-553d-47a0-a433-445b66500e1c" (UID: "d4d78f46-553d-47a0-a433-445b66500e1c"). InnerVolumeSpecName "kube-api-access-c9w89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.331036 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-kube-api-access-wzmd4" (OuterVolumeSpecName: "kube-api-access-wzmd4") pod "e20ac101-1bcb-4ca7-8a77-e827c5eb6383" (UID: "e20ac101-1bcb-4ca7-8a77-e827c5eb6383"). InnerVolumeSpecName "kube-api-access-wzmd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.331202 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-kube-api-access-thlk9" (OuterVolumeSpecName: "kube-api-access-thlk9") pod "2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0" (UID: "2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0"). InnerVolumeSpecName "kube-api-access-thlk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.331535 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980cd36a-7926-48a8-9749-559317eeee7f-kube-api-access-dczw5" (OuterVolumeSpecName: "kube-api-access-dczw5") pod "980cd36a-7926-48a8-9749-559317eeee7f" (UID: "980cd36a-7926-48a8-9749-559317eeee7f"). InnerVolumeSpecName "kube-api-access-dczw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.331989 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-kube-api-access-2r4t7" (OuterVolumeSpecName: "kube-api-access-2r4t7") pod "9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d" (UID: "9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d"). InnerVolumeSpecName "kube-api-access-2r4t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429463 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9w89\" (UniqueName: \"kubernetes.io/projected/d4d78f46-553d-47a0-a433-445b66500e1c-kube-api-access-c9w89\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429509 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d78f46-553d-47a0-a433-445b66500e1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429521 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dczw5\" (UniqueName: \"kubernetes.io/projected/980cd36a-7926-48a8-9749-559317eeee7f-kube-api-access-dczw5\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429532 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzmd4\" (UniqueName: 
\"kubernetes.io/projected/e20ac101-1bcb-4ca7-8a77-e827c5eb6383-kube-api-access-wzmd4\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429543 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r4t7\" (UniqueName: \"kubernetes.io/projected/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-kube-api-access-2r4t7\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429553 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/980cd36a-7926-48a8-9749-559317eeee7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429564 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thlk9\" (UniqueName: \"kubernetes.io/projected/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-kube-api-access-thlk9\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429575 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.429585 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.693362 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9ba2-account-create-update-2225k" event={"ID":"980cd36a-7926-48a8-9749-559317eeee7f","Type":"ContainerDied","Data":"1460a2a0861accd472952afd99dadc76404586ae3335130010d03ef1574b60cc"} Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.693396 4789 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1460a2a0861accd472952afd99dadc76404586ae3335130010d03ef1574b60cc" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.693452 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9ba2-account-create-update-2225k" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.695218 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cq5lc" event={"ID":"9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d","Type":"ContainerDied","Data":"c766f2e7e9b137c02840f8bc114babce75be46451b44f2bb5b9e05d0f74bf459"} Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.695245 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c766f2e7e9b137c02840f8bc114babce75be46451b44f2bb5b9e05d0f74bf459" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.695289 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cq5lc" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.700980 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dcc5-account-create-update-lx5zl" event={"ID":"d4d78f46-553d-47a0-a433-445b66500e1c","Type":"ContainerDied","Data":"7ea22bcee73b87eb6d3b70ae5db0c86fb86da51f3d2f5902c1f1c32049495ad0"} Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.701009 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea22bcee73b87eb6d3b70ae5db0c86fb86da51f3d2f5902c1f1c32049495ad0" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.701032 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dcc5-account-create-update-lx5zl" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.704361 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sm8ns" event={"ID":"2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0","Type":"ContainerDied","Data":"88f8d2c5b68c0a11cf6393996c89b58d529d5a3bafd1cfa475a8108537ec53f3"} Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.704383 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f8d2c5b68c0a11cf6393996c89b58d529d5a3bafd1cfa475a8108537ec53f3" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.704471 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sm8ns" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.705827 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c869-account-create-update-dwt2f" event={"ID":"e20ac101-1bcb-4ca7-8a77-e827c5eb6383","Type":"ContainerDied","Data":"a22c22690888ce152129c479868f590c078360f0f1973e2d66016c8cfc98eded"} Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.705850 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22c22690888ce152129c479868f590c078360f0f1973e2d66016c8cfc98eded" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.705877 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c869-account-create-update-dwt2f" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.709254 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2j8s" event={"ID":"16d1ec63-1e07-4430-84f6-6f356d6cb420","Type":"ContainerDied","Data":"1637ac2230fbe66a7ab91718178c98df73dc453879e491e567d12d62b06c7769"} Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.709300 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1637ac2230fbe66a7ab91718178c98df73dc453879e491e567d12d62b06c7769" Dec 16 07:11:01 crc kubenswrapper[4789]: I1216 07:11:01.709302 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n2j8s" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.126447 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" path="/var/lib/kubelet/pods/b50fd3ab-7d0c-438a-8de8-1001ab7209e3/volumes" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.653940 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9wqnt"] Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654247 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d1ec63-1e07-4430-84f6-6f356d6cb420" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654263 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d1ec63-1e07-4430-84f6-6f356d6cb420" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654294 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d78f46-553d-47a0-a433-445b66500e1c" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654301 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d78f46-553d-47a0-a433-445b66500e1c" 
containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654312 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="980cd36a-7926-48a8-9749-559317eeee7f" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654319 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="980cd36a-7926-48a8-9749-559317eeee7f" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654337 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654342 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654360 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerName="init" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654366 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerName="init" Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654373 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerName="dnsmasq-dns" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654379 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerName="dnsmasq-dns" Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654397 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20ac101-1bcb-4ca7-8a77-e827c5eb6383" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654403 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e20ac101-1bcb-4ca7-8a77-e827c5eb6383" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: E1216 07:11:02.654416 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654422 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654580 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654592 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="980cd36a-7926-48a8-9749-559317eeee7f" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654604 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20ac101-1bcb-4ca7-8a77-e827c5eb6383" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654615 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50fd3ab-7d0c-438a-8de8-1001ab7209e3" containerName="dnsmasq-dns" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654622 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d1ec63-1e07-4430-84f6-6f356d6cb420" containerName="mariadb-database-create" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654632 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d78f46-553d-47a0-a433-445b66500e1c" containerName="mariadb-account-create-update" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.654643 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d" containerName="mariadb-database-create" Dec 16 
07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.655187 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.656861 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.657097 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-l26rp" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.665724 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9wqnt"] Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.754022 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm57j\" (UniqueName: \"kubernetes.io/projected/11ebd2c4-dad5-403a-aa60-77241f62af72-kube-api-access-rm57j\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.754423 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-db-sync-config-data\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.754453 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-combined-ca-bundle\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.754476 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-config-data\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.856611 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-db-sync-config-data\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.856676 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-combined-ca-bundle\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.856711 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-config-data\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.856783 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm57j\" (UniqueName: \"kubernetes.io/projected/11ebd2c4-dad5-403a-aa60-77241f62af72-kube-api-access-rm57j\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.862821 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-db-sync-config-data\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.862953 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-config-data\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.864338 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-combined-ca-bundle\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.875974 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm57j\" (UniqueName: \"kubernetes.io/projected/11ebd2c4-dad5-403a-aa60-77241f62af72-kube-api-access-rm57j\") pod \"glance-db-sync-9wqnt\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:02 crc kubenswrapper[4789]: I1216 07:11:02.969774 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:03 crc kubenswrapper[4789]: I1216 07:11:03.466111 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9wqnt"] Dec 16 07:11:03 crc kubenswrapper[4789]: I1216 07:11:03.736378 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wqnt" event={"ID":"11ebd2c4-dad5-403a-aa60-77241f62af72","Type":"ContainerStarted","Data":"12c46a2c528c414291d2389b175182fe4c7e513ca84b8525455f55bb75a23bf5"} Dec 16 07:11:06 crc kubenswrapper[4789]: I1216 07:11:06.006376 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:11:06 crc kubenswrapper[4789]: E1216 07:11:06.006616 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:11:06 crc kubenswrapper[4789]: E1216 07:11:06.006827 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 07:11:06 crc kubenswrapper[4789]: E1216 07:11:06.006885 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift podName:cbd6bd33-5f98-4eb6-9fee-5080941ee4c0 nodeName:}" failed. No retries permitted until 2025-12-16 07:11:22.006865998 +0000 UTC m=+1220.268753627 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift") pod "swift-storage-0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0") : configmap "swift-ring-files" not found Dec 16 07:11:09 crc kubenswrapper[4789]: I1216 07:11:09.265551 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 16 07:11:11 crc kubenswrapper[4789]: I1216 07:11:11.808908 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0a1b402-4791-402a-aa3e-b7f400007ac2" containerID="6b9b43486b394346d4ab06609223e399c69302b8c06ad02c18fe02f7f5d8d2d8" exitCode=0 Dec 16 07:11:11 crc kubenswrapper[4789]: I1216 07:11:11.809034 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zk74x" event={"ID":"a0a1b402-4791-402a-aa3e-b7f400007ac2","Type":"ContainerDied","Data":"6b9b43486b394346d4ab06609223e399c69302b8c06ad02c18fe02f7f5d8d2d8"} Dec 16 07:11:14 crc kubenswrapper[4789]: I1216 07:11:14.828625 4789 generic.go:334] "Generic (PLEG): container finished" podID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerID="ff767a7cabcaa4f9752eac58d5657fbc09c94d5629fc004f9ab8e06f780b0a62" exitCode=0 Dec 16 07:11:14 crc kubenswrapper[4789]: I1216 07:11:14.828666 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31336d9f-38cf-4805-927b-3ae986f6c88e","Type":"ContainerDied","Data":"ff767a7cabcaa4f9752eac58d5657fbc09c94d5629fc004f9ab8e06f780b0a62"} Dec 16 07:11:14 crc kubenswrapper[4789]: I1216 07:11:14.830422 4789 generic.go:334] "Generic (PLEG): container finished" podID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerID="e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae" exitCode=0 Dec 16 07:11:14 crc kubenswrapper[4789]: I1216 07:11:14.830451 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"9452e1b2-42ec-47b6-96e1-2770c9e76db2","Type":"ContainerDied","Data":"e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae"} Dec 16 07:11:17 crc kubenswrapper[4789]: I1216 07:11:17.102961 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-cw7z9" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" probeResult="failure" output=< Dec 16 07:11:17 crc kubenswrapper[4789]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 07:11:17 crc kubenswrapper[4789]: > Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.166785 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.311240 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-scripts\") pod \"a0a1b402-4791-402a-aa3e-b7f400007ac2\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.311306 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0a1b402-4791-402a-aa3e-b7f400007ac2-etc-swift\") pod \"a0a1b402-4791-402a-aa3e-b7f400007ac2\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.312456 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a1b402-4791-402a-aa3e-b7f400007ac2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a0a1b402-4791-402a-aa3e-b7f400007ac2" (UID: "a0a1b402-4791-402a-aa3e-b7f400007ac2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.312605 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-swiftconf\") pod \"a0a1b402-4791-402a-aa3e-b7f400007ac2\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.312648 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-ring-data-devices\") pod \"a0a1b402-4791-402a-aa3e-b7f400007ac2\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.312677 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwhmh\" (UniqueName: \"kubernetes.io/projected/a0a1b402-4791-402a-aa3e-b7f400007ac2-kube-api-access-hwhmh\") pod \"a0a1b402-4791-402a-aa3e-b7f400007ac2\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.312706 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-dispersionconf\") pod \"a0a1b402-4791-402a-aa3e-b7f400007ac2\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.312731 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-combined-ca-bundle\") pod \"a0a1b402-4791-402a-aa3e-b7f400007ac2\" (UID: \"a0a1b402-4791-402a-aa3e-b7f400007ac2\") " Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.313252 4789 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/a0a1b402-4791-402a-aa3e-b7f400007ac2-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.314216 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a0a1b402-4791-402a-aa3e-b7f400007ac2" (UID: "a0a1b402-4791-402a-aa3e-b7f400007ac2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.318384 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a1b402-4791-402a-aa3e-b7f400007ac2-kube-api-access-hwhmh" (OuterVolumeSpecName: "kube-api-access-hwhmh") pod "a0a1b402-4791-402a-aa3e-b7f400007ac2" (UID: "a0a1b402-4791-402a-aa3e-b7f400007ac2"). InnerVolumeSpecName "kube-api-access-hwhmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.320374 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a0a1b402-4791-402a-aa3e-b7f400007ac2" (UID: "a0a1b402-4791-402a-aa3e-b7f400007ac2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.331726 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-scripts" (OuterVolumeSpecName: "scripts") pod "a0a1b402-4791-402a-aa3e-b7f400007ac2" (UID: "a0a1b402-4791-402a-aa3e-b7f400007ac2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.336622 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a0a1b402-4791-402a-aa3e-b7f400007ac2" (UID: "a0a1b402-4791-402a-aa3e-b7f400007ac2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.338863 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0a1b402-4791-402a-aa3e-b7f400007ac2" (UID: "a0a1b402-4791-402a-aa3e-b7f400007ac2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.415359 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.415387 4789 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.415400 4789 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0a1b402-4791-402a-aa3e-b7f400007ac2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.415414 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwhmh\" (UniqueName: \"kubernetes.io/projected/a0a1b402-4791-402a-aa3e-b7f400007ac2-kube-api-access-hwhmh\") on node \"crc\" DevicePath \"\"" Dec 16 
07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.415425 4789 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.415435 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a1b402-4791-402a-aa3e-b7f400007ac2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.865511 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31336d9f-38cf-4805-927b-3ae986f6c88e","Type":"ContainerStarted","Data":"238b569af7959004c01bd0394274b3ef8d6991fd0c3fdae6cc211fa624cb5354"} Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.866054 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.867529 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zk74x" event={"ID":"a0a1b402-4791-402a-aa3e-b7f400007ac2","Type":"ContainerDied","Data":"23d7efc16b36291185c1d2ab0e1197a02d4dd26b8e2d758a415937de4620905d"} Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.867565 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23d7efc16b36291185c1d2ab0e1197a02d4dd26b8e2d758a415937de4620905d" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.867541 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zk74x" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.875322 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wqnt" event={"ID":"11ebd2c4-dad5-403a-aa60-77241f62af72","Type":"ContainerStarted","Data":"49ce1da469ce6b385a94da8426c33871259532e85a44ed87376c8c1678c0c690"} Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.882516 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9452e1b2-42ec-47b6-96e1-2770c9e76db2","Type":"ContainerStarted","Data":"fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362"} Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.882983 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.910752 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.54631547 podStartE2EDuration="1m6.910730426s" podCreationTimestamp="2025-12-16 07:10:12 +0000 UTC" firstStartedPulling="2025-12-16 07:10:15.200714001 +0000 UTC m=+1153.462601630" lastFinishedPulling="2025-12-16 07:10:40.565128957 +0000 UTC m=+1178.827016586" observedRunningTime="2025-12-16 07:11:18.895015433 +0000 UTC m=+1217.156903082" watchObservedRunningTime="2025-12-16 07:11:18.910730426 +0000 UTC m=+1217.172618055" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.919069 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9wqnt" podStartSLOduration=2.308020697 podStartE2EDuration="16.919050499s" podCreationTimestamp="2025-12-16 07:11:02 +0000 UTC" firstStartedPulling="2025-12-16 07:11:03.477302487 +0000 UTC m=+1201.739190116" lastFinishedPulling="2025-12-16 07:11:18.088332299 +0000 UTC m=+1216.350219918" observedRunningTime="2025-12-16 07:11:18.912194591 +0000 UTC 
m=+1217.174082220" watchObservedRunningTime="2025-12-16 07:11:18.919050499 +0000 UTC m=+1217.180938128" Dec 16 07:11:18 crc kubenswrapper[4789]: I1216 07:11:18.941935 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.238985998 podStartE2EDuration="1m6.941903966s" podCreationTimestamp="2025-12-16 07:10:12 +0000 UTC" firstStartedPulling="2025-12-16 07:10:14.534741876 +0000 UTC m=+1152.796629525" lastFinishedPulling="2025-12-16 07:10:39.237659864 +0000 UTC m=+1177.499547493" observedRunningTime="2025-12-16 07:11:18.939884686 +0000 UTC m=+1217.201772355" watchObservedRunningTime="2025-12-16 07:11:18.941903966 +0000 UTC m=+1217.203791595" Dec 16 07:11:21 crc kubenswrapper[4789]: I1216 07:11:21.927556 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:11:21 crc kubenswrapper[4789]: I1216 07:11:21.928010 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.080365 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-cw7z9" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" probeResult="failure" output=< Dec 16 07:11:22 crc kubenswrapper[4789]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 07:11:22 crc kubenswrapper[4789]: > Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.080395 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.096862 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"swift-storage-0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " pod="openstack/swift-storage-0" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.197925 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.201887 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.215637 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.466153 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cw7z9-config-4l46k"] Dec 16 07:11:22 crc kubenswrapper[4789]: E1216 07:11:22.466845 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a1b402-4791-402a-aa3e-b7f400007ac2" containerName="swift-ring-rebalance" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.466868 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a1b402-4791-402a-aa3e-b7f400007ac2" containerName="swift-ring-rebalance" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.467071 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a1b402-4791-402a-aa3e-b7f400007ac2" containerName="swift-ring-rebalance" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.468496 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.470511 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.493184 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9-config-4l46k"] Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.605694 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-scripts\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.605750 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.605787 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-additional-scripts\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.606047 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2w9w\" (UniqueName: \"kubernetes.io/projected/020e93cc-eea9-47e1-a918-81a480863e2e-kube-api-access-f2w9w\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.606287 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run-ovn\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.606425 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-log-ovn\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708547 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run-ovn\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708633 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-log-ovn\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708677 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-scripts\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708728 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708780 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-additional-scripts\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708817 4789 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f2w9w\" (UniqueName: \"kubernetes.io/projected/020e93cc-eea9-47e1-a918-81a480863e2e-kube-api-access-f2w9w\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708890 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708898 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-log-ovn\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.708989 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run-ovn\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.709730 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-additional-scripts\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.710766 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-scripts\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.725519 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2w9w\" (UniqueName: \"kubernetes.io/projected/020e93cc-eea9-47e1-a918-81a480863e2e-kube-api-access-f2w9w\") pod \"ovn-controller-cw7z9-config-4l46k\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.791510 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:22 crc kubenswrapper[4789]: I1216 07:11:22.906202 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:11:22 crc kubenswrapper[4789]: W1216 07:11:22.913528 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbd6bd33_5f98_4eb6_9fee_5080941ee4c0.slice/crio-aa014e0d284e6ea46e4838e5e30274d9085c41e7e66e8dced92e2bba1d40352d WatchSource:0}: Error finding container aa014e0d284e6ea46e4838e5e30274d9085c41e7e66e8dced92e2bba1d40352d: Status 404 returned error can't find the container with id aa014e0d284e6ea46e4838e5e30274d9085c41e7e66e8dced92e2bba1d40352d Dec 16 07:11:23 crc kubenswrapper[4789]: I1216 07:11:23.142349 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9-config-4l46k"] Dec 16 07:11:23 crc kubenswrapper[4789]: I1216 07:11:23.926806 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-4l46k" event={"ID":"020e93cc-eea9-47e1-a918-81a480863e2e","Type":"ContainerStarted","Data":"1a69f192606363373bec9813a6b1bcee5852eefa076923d229a54ab7daf7d583"} Dec 16 
07:11:23 crc kubenswrapper[4789]: I1216 07:11:23.927585 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-4l46k" event={"ID":"020e93cc-eea9-47e1-a918-81a480863e2e","Type":"ContainerStarted","Data":"634c9ee722e5ebb0daa27e7d90e860460e30a33963e9568bb600988c5d216bf6"} Dec 16 07:11:23 crc kubenswrapper[4789]: I1216 07:11:23.928151 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"aa014e0d284e6ea46e4838e5e30274d9085c41e7e66e8dced92e2bba1d40352d"} Dec 16 07:11:24 crc kubenswrapper[4789]: I1216 07:11:24.937561 4789 generic.go:334] "Generic (PLEG): container finished" podID="020e93cc-eea9-47e1-a918-81a480863e2e" containerID="1a69f192606363373bec9813a6b1bcee5852eefa076923d229a54ab7daf7d583" exitCode=0 Dec 16 07:11:24 crc kubenswrapper[4789]: I1216 07:11:24.937667 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-4l46k" event={"ID":"020e93cc-eea9-47e1-a918-81a480863e2e","Type":"ContainerDied","Data":"1a69f192606363373bec9813a6b1bcee5852eefa076923d229a54ab7daf7d583"} Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.330542 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.481480 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-scripts\") pod \"020e93cc-eea9-47e1-a918-81a480863e2e\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.481547 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-additional-scripts\") pod \"020e93cc-eea9-47e1-a918-81a480863e2e\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.481568 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-log-ovn\") pod \"020e93cc-eea9-47e1-a918-81a480863e2e\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.481740 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run-ovn\") pod \"020e93cc-eea9-47e1-a918-81a480863e2e\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.481794 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2w9w\" (UniqueName: \"kubernetes.io/projected/020e93cc-eea9-47e1-a918-81a480863e2e-kube-api-access-f2w9w\") pod \"020e93cc-eea9-47e1-a918-81a480863e2e\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.481820 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run\") pod \"020e93cc-eea9-47e1-a918-81a480863e2e\" (UID: \"020e93cc-eea9-47e1-a918-81a480863e2e\") " Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.481975 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "020e93cc-eea9-47e1-a918-81a480863e2e" (UID: "020e93cc-eea9-47e1-a918-81a480863e2e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.482031 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "020e93cc-eea9-47e1-a918-81a480863e2e" (UID: "020e93cc-eea9-47e1-a918-81a480863e2e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.482270 4789 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.482299 4789 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.482307 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "020e93cc-eea9-47e1-a918-81a480863e2e" (UID: "020e93cc-eea9-47e1-a918-81a480863e2e"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.482353 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run" (OuterVolumeSpecName: "var-run") pod "020e93cc-eea9-47e1-a918-81a480863e2e" (UID: "020e93cc-eea9-47e1-a918-81a480863e2e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.482614 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-scripts" (OuterVolumeSpecName: "scripts") pod "020e93cc-eea9-47e1-a918-81a480863e2e" (UID: "020e93cc-eea9-47e1-a918-81a480863e2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.488411 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020e93cc-eea9-47e1-a918-81a480863e2e-kube-api-access-f2w9w" (OuterVolumeSpecName: "kube-api-access-f2w9w") pod "020e93cc-eea9-47e1-a918-81a480863e2e" (UID: "020e93cc-eea9-47e1-a918-81a480863e2e"). InnerVolumeSpecName "kube-api-access-f2w9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.588378 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2w9w\" (UniqueName: \"kubernetes.io/projected/020e93cc-eea9-47e1-a918-81a480863e2e-kube-api-access-f2w9w\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.588670 4789 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/020e93cc-eea9-47e1-a918-81a480863e2e-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.588681 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.588692 4789 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/020e93cc-eea9-47e1-a918-81a480863e2e-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.955128 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-4l46k" event={"ID":"020e93cc-eea9-47e1-a918-81a480863e2e","Type":"ContainerDied","Data":"634c9ee722e5ebb0daa27e7d90e860460e30a33963e9568bb600988c5d216bf6"} Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.955149 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-4l46k" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.955179 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="634c9ee722e5ebb0daa27e7d90e860460e30a33963e9568bb600988c5d216bf6" Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.958462 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"391f051ceefce6af95f3e5e5fc2ba9a787ede01ec802f107f998941a77f4283e"} Dec 16 07:11:26 crc kubenswrapper[4789]: I1216 07:11:26.958525 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"7dd74cf2b547abd9c20fc6d29daa7d954817be3444474dc3629c37701cc99230"} Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.091501 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-cw7z9" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.446119 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-cw7z9-config-4l46k"] Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.454110 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-cw7z9-config-4l46k"] Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.481513 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cw7z9-config-9444p"] Dec 16 07:11:27 crc kubenswrapper[4789]: E1216 07:11:27.481881 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e93cc-eea9-47e1-a918-81a480863e2e" containerName="ovn-config" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.481989 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="020e93cc-eea9-47e1-a918-81a480863e2e" containerName="ovn-config" Dec 16 07:11:27 crc 
kubenswrapper[4789]: I1216 07:11:27.482229 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e93cc-eea9-47e1-a918-81a480863e2e" containerName="ovn-config" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.482781 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.489664 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.505632 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9-config-9444p"] Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.507375 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run-ovn\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.507697 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-log-ovn\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.507809 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 
07:11:27.507833 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnvf\" (UniqueName: \"kubernetes.io/projected/3cd1e986-c9a7-4027-8347-13e915282bce-kube-api-access-nbnvf\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.507857 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-scripts\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.508054 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-additional-scripts\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.609313 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-log-ovn\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.609399 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 
07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.609429 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbnvf\" (UniqueName: \"kubernetes.io/projected/3cd1e986-c9a7-4027-8347-13e915282bce-kube-api-access-nbnvf\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.609460 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-scripts\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.609564 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-additional-scripts\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.609609 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run-ovn\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.609760 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run-ovn\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc 
kubenswrapper[4789]: I1216 07:11:27.610127 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.610574 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-additional-scripts\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.612737 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-scripts\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.612931 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-log-ovn\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.641102 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbnvf\" (UniqueName: \"kubernetes.io/projected/3cd1e986-c9a7-4027-8347-13e915282bce-kube-api-access-nbnvf\") pod \"ovn-controller-cw7z9-config-9444p\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.800580 4789 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.968411 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"20593004d226e1585979c62630548d692855df2932aab4c7c86476377d9cc2cc"} Dec 16 07:11:27 crc kubenswrapper[4789]: I1216 07:11:27.968453 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"b93f37726e0744613bc7b449e38506e91bd311f3c6efe8bbf38923fdf51b2146"} Dec 16 07:11:28 crc kubenswrapper[4789]: I1216 07:11:28.113747 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020e93cc-eea9-47e1-a918-81a480863e2e" path="/var/lib/kubelet/pods/020e93cc-eea9-47e1-a918-81a480863e2e/volumes" Dec 16 07:11:28 crc kubenswrapper[4789]: I1216 07:11:28.218796 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9-config-9444p"] Dec 16 07:11:28 crc kubenswrapper[4789]: I1216 07:11:28.979064 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"7428e5236584f2fe103930cb1f61dd303456f8c0deb11b5bbb601d51deecfb66"} Dec 16 07:11:28 crc kubenswrapper[4789]: I1216 07:11:28.985891 4789 generic.go:334] "Generic (PLEG): container finished" podID="3cd1e986-c9a7-4027-8347-13e915282bce" containerID="a4e1972471943947df87b1d22c704377beeab31b729a1c467937dfb3523caf4d" exitCode=0 Dec 16 07:11:28 crc kubenswrapper[4789]: I1216 07:11:28.985941 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-9444p" 
event={"ID":"3cd1e986-c9a7-4027-8347-13e915282bce","Type":"ContainerDied","Data":"a4e1972471943947df87b1d22c704377beeab31b729a1c467937dfb3523caf4d"} Dec 16 07:11:28 crc kubenswrapper[4789]: I1216 07:11:28.985981 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-9444p" event={"ID":"3cd1e986-c9a7-4027-8347-13e915282bce","Type":"ContainerStarted","Data":"0562a869e5e6e5dd254900cf981ef526370af07a640c25465120e43b24e0d861"} Dec 16 07:11:29 crc kubenswrapper[4789]: I1216 07:11:29.998638 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"e0c8a6f56c8022db43b02bf2bd015331c0cdd2235c3eca42b9e1e1f7f8bd3705"} Dec 16 07:11:29 crc kubenswrapper[4789]: I1216 07:11:29.998685 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"edcd02c79a5409469199dea08015de9c6eeffbea5566bd3cd4db97a260e47fdd"} Dec 16 07:11:29 crc kubenswrapper[4789]: I1216 07:11:29.998695 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"ec371978a44bd2c62cd3ea38c393bf36090b055edd6151b95aa9b353fbdb7387"} Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.624011 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.694050 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run-ovn\") pod \"3cd1e986-c9a7-4027-8347-13e915282bce\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.694505 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run\") pod \"3cd1e986-c9a7-4027-8347-13e915282bce\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.694653 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-additional-scripts\") pod \"3cd1e986-c9a7-4027-8347-13e915282bce\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.694779 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-scripts\") pod \"3cd1e986-c9a7-4027-8347-13e915282bce\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.694900 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbnvf\" (UniqueName: \"kubernetes.io/projected/3cd1e986-c9a7-4027-8347-13e915282bce-kube-api-access-nbnvf\") pod \"3cd1e986-c9a7-4027-8347-13e915282bce\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.695103 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-log-ovn\") pod \"3cd1e986-c9a7-4027-8347-13e915282bce\" (UID: \"3cd1e986-c9a7-4027-8347-13e915282bce\") " Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.695544 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3cd1e986-c9a7-4027-8347-13e915282bce" (UID: "3cd1e986-c9a7-4027-8347-13e915282bce"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.695696 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3cd1e986-c9a7-4027-8347-13e915282bce" (UID: "3cd1e986-c9a7-4027-8347-13e915282bce"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.695901 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run" (OuterVolumeSpecName: "var-run") pod "3cd1e986-c9a7-4027-8347-13e915282bce" (UID: "3cd1e986-c9a7-4027-8347-13e915282bce"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.696951 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3cd1e986-c9a7-4027-8347-13e915282bce" (UID: "3cd1e986-c9a7-4027-8347-13e915282bce"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.697866 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-scripts" (OuterVolumeSpecName: "scripts") pod "3cd1e986-c9a7-4027-8347-13e915282bce" (UID: "3cd1e986-c9a7-4027-8347-13e915282bce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.704102 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd1e986-c9a7-4027-8347-13e915282bce-kube-api-access-nbnvf" (OuterVolumeSpecName: "kube-api-access-nbnvf") pod "3cd1e986-c9a7-4027-8347-13e915282bce" (UID: "3cd1e986-c9a7-4027-8347-13e915282bce"). InnerVolumeSpecName "kube-api-access-nbnvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.796255 4789 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.796294 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cd1e986-c9a7-4027-8347-13e915282bce-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.796306 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbnvf\" (UniqueName: \"kubernetes.io/projected/3cd1e986-c9a7-4027-8347-13e915282bce-kube-api-access-nbnvf\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.796319 4789 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-log-ovn\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.796331 4789 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:32 crc kubenswrapper[4789]: I1216 07:11:32.796342 4789 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cd1e986-c9a7-4027-8347-13e915282bce-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.019544 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-9444p" event={"ID":"3cd1e986-c9a7-4027-8347-13e915282bce","Type":"ContainerDied","Data":"0562a869e5e6e5dd254900cf981ef526370af07a640c25465120e43b24e0d861"} Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.019582 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0562a869e5e6e5dd254900cf981ef526370af07a640c25465120e43b24e0d861" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.019938 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-9444p" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.699711 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-cw7z9-config-9444p"] Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.710627 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-cw7z9-config-9444p"] Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.756402 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cw7z9-config-h9msv"] Dec 16 07:11:33 crc kubenswrapper[4789]: E1216 07:11:33.756973 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd1e986-c9a7-4027-8347-13e915282bce" containerName="ovn-config" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.757013 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd1e986-c9a7-4027-8347-13e915282bce" containerName="ovn-config" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.757283 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd1e986-c9a7-4027-8347-13e915282bce" containerName="ovn-config" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.758075 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.764531 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.766658 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9-config-h9msv"] Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.811267 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-log-ovn\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.811340 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-scripts\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.811461 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-additional-scripts\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.811491 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: 
\"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.811617 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run-ovn\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.811645 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27cqf\" (UniqueName: \"kubernetes.io/projected/6aa9cb74-679f-43d5-818a-3887a7f7987b-kube-api-access-27cqf\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.912778 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-additional-scripts\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.912822 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.912897 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run-ovn\") pod 
\"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.912927 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27cqf\" (UniqueName: \"kubernetes.io/projected/6aa9cb74-679f-43d5-818a-3887a7f7987b-kube-api-access-27cqf\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.912943 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-log-ovn\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.912973 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-scripts\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.913228 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.913228 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run-ovn\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: 
\"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.913229 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-log-ovn\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.913666 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-additional-scripts\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.914837 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-scripts\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.948794 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27cqf\" (UniqueName: \"kubernetes.io/projected/6aa9cb74-679f-43d5-818a-3887a7f7987b-kube-api-access-27cqf\") pod \"ovn-controller-cw7z9-config-h9msv\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:33 crc kubenswrapper[4789]: I1216 07:11:33.998132 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 07:11:34 crc kubenswrapper[4789]: I1216 07:11:34.038137 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="11ebd2c4-dad5-403a-aa60-77241f62af72" containerID="49ce1da469ce6b385a94da8426c33871259532e85a44ed87376c8c1678c0c690" exitCode=0 Dec 16 07:11:34 crc kubenswrapper[4789]: I1216 07:11:34.038182 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wqnt" event={"ID":"11ebd2c4-dad5-403a-aa60-77241f62af72","Type":"ContainerDied","Data":"49ce1da469ce6b385a94da8426c33871259532e85a44ed87376c8c1678c0c690"} Dec 16 07:11:34 crc kubenswrapper[4789]: I1216 07:11:34.085177 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:34 crc kubenswrapper[4789]: I1216 07:11:34.115801 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd1e986-c9a7-4027-8347-13e915282bce" path="/var/lib/kubelet/pods/3cd1e986-c9a7-4027-8347-13e915282bce/volumes" Dec 16 07:11:34 crc kubenswrapper[4789]: I1216 07:11:34.402115 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 07:11:34 crc kubenswrapper[4789]: I1216 07:11:34.818083 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cw7z9-config-h9msv"] Dec 16 07:11:34 crc kubenswrapper[4789]: W1216 07:11:34.830841 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aa9cb74_679f_43d5_818a_3887a7f7987b.slice/crio-09b571b00cab68852983e04f24de5b4e2d037a7d46e31fced9f32b1812988df1 WatchSource:0}: Error finding container 09b571b00cab68852983e04f24de5b4e2d037a7d46e31fced9f32b1812988df1: Status 404 returned error can't find the container with id 09b571b00cab68852983e04f24de5b4e2d037a7d46e31fced9f32b1812988df1 Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.052222 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-h9msv" 
event={"ID":"6aa9cb74-679f-43d5-818a-3887a7f7987b","Type":"ContainerStarted","Data":"09b571b00cab68852983e04f24de5b4e2d037a7d46e31fced9f32b1812988df1"} Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.056511 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"bf3fe2408d858c60b990dfb63b6c210d31747a7a36a94cb83c07d547d090370f"} Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.056563 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"dfd251c4b8cc4551da74250c7e1018cc05d1c34c1749b00d7314e5704a70d11c"} Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.056580 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"c05e3cfb0b0446d45c6b1efc03786be1905a9914fbbf8eca279bc89ee3642716"} Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.626291 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.739550 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-db-sync-config-data\") pod \"11ebd2c4-dad5-403a-aa60-77241f62af72\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.739659 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-config-data\") pod \"11ebd2c4-dad5-403a-aa60-77241f62af72\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.739689 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-combined-ca-bundle\") pod \"11ebd2c4-dad5-403a-aa60-77241f62af72\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.739707 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm57j\" (UniqueName: \"kubernetes.io/projected/11ebd2c4-dad5-403a-aa60-77241f62af72-kube-api-access-rm57j\") pod \"11ebd2c4-dad5-403a-aa60-77241f62af72\" (UID: \"11ebd2c4-dad5-403a-aa60-77241f62af72\") " Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.745868 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "11ebd2c4-dad5-403a-aa60-77241f62af72" (UID: "11ebd2c4-dad5-403a-aa60-77241f62af72"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.745891 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ebd2c4-dad5-403a-aa60-77241f62af72-kube-api-access-rm57j" (OuterVolumeSpecName: "kube-api-access-rm57j") pod "11ebd2c4-dad5-403a-aa60-77241f62af72" (UID: "11ebd2c4-dad5-403a-aa60-77241f62af72"). InnerVolumeSpecName "kube-api-access-rm57j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.784892 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11ebd2c4-dad5-403a-aa60-77241f62af72" (UID: "11ebd2c4-dad5-403a-aa60-77241f62af72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.786091 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-config-data" (OuterVolumeSpecName: "config-data") pod "11ebd2c4-dad5-403a-aa60-77241f62af72" (UID: "11ebd2c4-dad5-403a-aa60-77241f62af72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.841947 4789 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.841983 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.841991 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11ebd2c4-dad5-403a-aa60-77241f62af72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:35 crc kubenswrapper[4789]: I1216 07:11:35.842000 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm57j\" (UniqueName: \"kubernetes.io/projected/11ebd2c4-dad5-403a-aa60-77241f62af72-kube-api-access-rm57j\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.050804 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5vnz5"] Dec 16 07:11:36 crc kubenswrapper[4789]: E1216 07:11:36.051497 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ebd2c4-dad5-403a-aa60-77241f62af72" containerName="glance-db-sync" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.051519 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ebd2c4-dad5-403a-aa60-77241f62af72" containerName="glance-db-sync" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.051732 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ebd2c4-dad5-403a-aa60-77241f62af72" containerName="glance-db-sync" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.052409 4789 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.075698 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5vnz5"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.134244 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"abff080aef14c07b0b737efd0a65faff826c48715b5f1c2ab9b91640d17f6623"} Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.134289 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"d8238af7dbf15f23415f0c86259fcf9957fbc0b08bcb581d4f0624333c152ec1"} Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.134303 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"58f8b4cb7ddbfc39c3c2c236d8c52319b46445fa6bd8e36d14a249780702ad85"} Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.134316 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerStarted","Data":"7f58e5c14558f31f6600906b48eb2e6f74d0e6249665f123eef015ba515b9e8b"} Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.136804 4789 generic.go:334] "Generic (PLEG): container finished" podID="6aa9cb74-679f-43d5-818a-3887a7f7987b" containerID="8db8c91f3588978f5cc5cf7aa8691f106f9c0d392f047d1074534065dd1f409d" exitCode=0 Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.136881 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-h9msv" 
event={"ID":"6aa9cb74-679f-43d5-818a-3887a7f7987b","Type":"ContainerDied","Data":"8db8c91f3588978f5cc5cf7aa8691f106f9c0d392f047d1074534065dd1f409d"} Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.139001 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9wqnt" event={"ID":"11ebd2c4-dad5-403a-aa60-77241f62af72","Type":"ContainerDied","Data":"12c46a2c528c414291d2389b175182fe4c7e513ca84b8525455f55bb75a23bf5"} Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.139033 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12c46a2c528c414291d2389b175182fe4c7e513ca84b8525455f55bb75a23bf5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.139107 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9wqnt" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.145571 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f98c\" (UniqueName: \"kubernetes.io/projected/bb595623-26e8-470c-bfa0-565282778cbb-kube-api-access-4f98c\") pod \"cinder-db-create-5vnz5\" (UID: \"bb595623-26e8-470c-bfa0-565282778cbb\") " pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.145637 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb595623-26e8-470c-bfa0-565282778cbb-operator-scripts\") pod \"cinder-db-create-5vnz5\" (UID: \"bb595623-26e8-470c-bfa0-565282778cbb\") " pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.183793 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-9cs4h"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.184749 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.197834 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9a65-account-create-update-ghjsp"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.199042 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.201287 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.215410 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9cs4h"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.222112 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a65-account-create-update-ghjsp"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.246856 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f98c\" (UniqueName: \"kubernetes.io/projected/bb595623-26e8-470c-bfa0-565282778cbb-kube-api-access-4f98c\") pod \"cinder-db-create-5vnz5\" (UID: \"bb595623-26e8-470c-bfa0-565282778cbb\") " pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.246954 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb595623-26e8-470c-bfa0-565282778cbb-operator-scripts\") pod \"cinder-db-create-5vnz5\" (UID: \"bb595623-26e8-470c-bfa0-565282778cbb\") " pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.248879 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb595623-26e8-470c-bfa0-565282778cbb-operator-scripts\") pod \"cinder-db-create-5vnz5\" (UID: 
\"bb595623-26e8-470c-bfa0-565282778cbb\") " pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.285519 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.846270132 podStartE2EDuration="47.285503551s" podCreationTimestamp="2025-12-16 07:10:49 +0000 UTC" firstStartedPulling="2025-12-16 07:11:22.915565702 +0000 UTC m=+1221.177453341" lastFinishedPulling="2025-12-16 07:11:34.354799131 +0000 UTC m=+1232.616686760" observedRunningTime="2025-12-16 07:11:36.278130071 +0000 UTC m=+1234.540017700" watchObservedRunningTime="2025-12-16 07:11:36.285503551 +0000 UTC m=+1234.547391180" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.294758 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f98c\" (UniqueName: \"kubernetes.io/projected/bb595623-26e8-470c-bfa0-565282778cbb-kube-api-access-4f98c\") pod \"cinder-db-create-5vnz5\" (UID: \"bb595623-26e8-470c-bfa0-565282778cbb\") " pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.348727 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-operator-scripts\") pod \"cinder-9a65-account-create-update-ghjsp\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.348860 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6v4\" (UniqueName: \"kubernetes.io/projected/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-kube-api-access-hw6v4\") pod \"cinder-9a65-account-create-update-ghjsp\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: 
I1216 07:11:36.349108 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8b69c5-5882-42d9-8154-1a39e0b55178-operator-scripts\") pod \"barbican-db-create-9cs4h\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.349191 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxnl\" (UniqueName: \"kubernetes.io/projected/7a8b69c5-5882-42d9-8154-1a39e0b55178-kube-api-access-cjxnl\") pod \"barbican-db-create-9cs4h\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.368784 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-30bf-account-create-update-cgmvn"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.369887 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.370130 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.373274 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.389432 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-30bf-account-create-update-cgmvn"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.452837 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2aa4a5-f152-4361-822c-a114f9b41b49-operator-scripts\") pod \"barbican-30bf-account-create-update-cgmvn\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.452888 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6v4\" (UniqueName: \"kubernetes.io/projected/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-kube-api-access-hw6v4\") pod \"cinder-9a65-account-create-update-ghjsp\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.452933 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgc5f\" (UniqueName: \"kubernetes.io/projected/8a2aa4a5-f152-4361-822c-a114f9b41b49-kube-api-access-zgc5f\") pod \"barbican-30bf-account-create-update-cgmvn\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.452989 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8b69c5-5882-42d9-8154-1a39e0b55178-operator-scripts\") pod 
\"barbican-db-create-9cs4h\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.453014 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxnl\" (UniqueName: \"kubernetes.io/projected/7a8b69c5-5882-42d9-8154-1a39e0b55178-kube-api-access-cjxnl\") pod \"barbican-db-create-9cs4h\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.453061 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-operator-scripts\") pod \"cinder-9a65-account-create-update-ghjsp\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.453905 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-operator-scripts\") pod \"cinder-9a65-account-create-update-ghjsp\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.454670 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8b69c5-5882-42d9-8154-1a39e0b55178-operator-scripts\") pod \"barbican-db-create-9cs4h\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.494089 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxnl\" (UniqueName: \"kubernetes.io/projected/7a8b69c5-5882-42d9-8154-1a39e0b55178-kube-api-access-cjxnl\") pod 
\"barbican-db-create-9cs4h\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.498500 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6v4\" (UniqueName: \"kubernetes.io/projected/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-kube-api-access-hw6v4\") pod \"cinder-9a65-account-create-update-ghjsp\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.504131 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fmwzn"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.504666 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.505231 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.516770 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.518968 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fmwzn"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.554672 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2aa4a5-f152-4361-822c-a114f9b41b49-operator-scripts\") pod \"barbican-30bf-account-create-update-cgmvn\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.555010 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgc5f\" (UniqueName: \"kubernetes.io/projected/8a2aa4a5-f152-4361-822c-a114f9b41b49-kube-api-access-zgc5f\") pod \"barbican-30bf-account-create-update-cgmvn\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.557101 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2aa4a5-f152-4361-822c-a114f9b41b49-operator-scripts\") pod \"barbican-30bf-account-create-update-cgmvn\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.560619 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5298-account-create-update-hrzjv"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.561649 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.563973 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.583177 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgc5f\" (UniqueName: \"kubernetes.io/projected/8a2aa4a5-f152-4361-822c-a114f9b41b49-kube-api-access-zgc5f\") pod \"barbican-30bf-account-create-update-cgmvn\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.602339 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5298-account-create-update-hrzjv"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.655020 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jlc8k"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.656136 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.656317 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfzgc\" (UniqueName: \"kubernetes.io/projected/9740f406-8da5-496b-a8c7-b0c7474fe4da-kube-api-access-qfzgc\") pod \"neutron-db-create-fmwzn\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.656386 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9740f406-8da5-496b-a8c7-b0c7474fe4da-operator-scripts\") pod \"neutron-db-create-fmwzn\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.656453 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15382adc-269b-498c-ae42-a5e8a681e386-operator-scripts\") pod \"neutron-5298-account-create-update-hrzjv\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.656481 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf645\" (UniqueName: \"kubernetes.io/projected/15382adc-269b-498c-ae42-a5e8a681e386-kube-api-access-bf645\") pod \"neutron-5298-account-create-update-hrzjv\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.660491 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.660673 4789 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.660791 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.666181 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-mvvbr"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.667390 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.678701 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jlc8k"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.679364 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2cc6f" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.694414 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.712798 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-mvvbr"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760264 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cx2n\" (UniqueName: \"kubernetes.io/projected/05899547-935a-47e4-b055-fb03cf46afa8-kube-api-access-8cx2n\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760336 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15382adc-269b-498c-ae42-a5e8a681e386-operator-scripts\") pod \"neutron-5298-account-create-update-hrzjv\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760376 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760399 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf645\" (UniqueName: \"kubernetes.io/projected/15382adc-269b-498c-ae42-a5e8a681e386-kube-api-access-bf645\") pod \"neutron-5298-account-create-update-hrzjv\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760423 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760449 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-combined-ca-bundle\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760489 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfzgc\" (UniqueName: \"kubernetes.io/projected/9740f406-8da5-496b-a8c7-b0c7474fe4da-kube-api-access-qfzgc\") pod \"neutron-db-create-fmwzn\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760550 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9740f406-8da5-496b-a8c7-b0c7474fe4da-operator-scripts\") pod \"neutron-db-create-fmwzn\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760574 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-config\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760603 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-config-data\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760643 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fxvb\" (UniqueName: \"kubernetes.io/projected/08d5548c-14fe-416e-86d8-f6845cbcc57c-kube-api-access-5fxvb\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.760657 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-dns-svc\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.761440 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15382adc-269b-498c-ae42-a5e8a681e386-operator-scripts\") pod \"neutron-5298-account-create-update-hrzjv\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.764327 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9740f406-8da5-496b-a8c7-b0c7474fe4da-operator-scripts\") pod \"neutron-db-create-fmwzn\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.813765 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf645\" (UniqueName: \"kubernetes.io/projected/15382adc-269b-498c-ae42-a5e8a681e386-kube-api-access-bf645\") pod \"neutron-5298-account-create-update-hrzjv\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.829736 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-mvvbr"] Dec 16 07:11:36 crc kubenswrapper[4789]: E1216 07:11:36.830844 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-8cx2n ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" podUID="05899547-935a-47e4-b055-fb03cf46afa8" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.851576 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfzgc\" (UniqueName: \"kubernetes.io/projected/9740f406-8da5-496b-a8c7-b0c7474fe4da-kube-api-access-qfzgc\") pod \"neutron-db-create-fmwzn\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.863313 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865335 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fxvb\" (UniqueName: \"kubernetes.io/projected/08d5548c-14fe-416e-86d8-f6845cbcc57c-kube-api-access-5fxvb\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865372 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-dns-svc\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865419 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cx2n\" (UniqueName: \"kubernetes.io/projected/05899547-935a-47e4-b055-fb03cf46afa8-kube-api-access-8cx2n\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865480 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865500 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " 
pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865522 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-combined-ca-bundle\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865653 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-config\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.865685 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-config-data\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.872874 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-dns-svc\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.873564 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-sb\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.874061 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-nb\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.875870 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-config\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.891403 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-config-data\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.897943 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-combined-ca-bundle\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.918664 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.921234 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cx2n\" (UniqueName: \"kubernetes.io/projected/05899547-935a-47e4-b055-fb03cf46afa8-kube-api-access-8cx2n\") pod \"dnsmasq-dns-6bfd654465-mvvbr\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.956199 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fxvb\" (UniqueName: \"kubernetes.io/projected/08d5548c-14fe-416e-86d8-f6845cbcc57c-kube-api-access-5fxvb\") pod \"keystone-db-sync-jlc8k\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.963142 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-6vb5v"] Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.964675 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.970939 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 16 07:11:36 crc kubenswrapper[4789]: I1216 07:11:36.984735 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-6vb5v"] Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:36.995560 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.070007 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.070082 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-config\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.070122 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.070180 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.070196 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: 
\"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.070220 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v859p\" (UniqueName: \"kubernetes.io/projected/81bd104f-cabd-425e-960d-32a7c8f65d4d-kube-api-access-v859p\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.144482 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5vnz5"] Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.174300 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.174395 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.174416 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.174438 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v859p\" 
(UniqueName: \"kubernetes.io/projected/81bd104f-cabd-425e-960d-32a7c8f65d4d-kube-api-access-v859p\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.174506 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.174558 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-config\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.179048 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.183413 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-config\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.186186 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-svc\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.186906 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-nb\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.190080 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-sb\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.190636 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-swift-storage-0\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: 
I1216 07:11:37.212771 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.230187 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v859p\" (UniqueName: \"kubernetes.io/projected/81bd104f-cabd-425e-960d-32a7c8f65d4d-kube-api-access-v859p\") pod \"dnsmasq-dns-74dfc89d77-6vb5v\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.248512 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.277530 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-sb\") pod \"05899547-935a-47e4-b055-fb03cf46afa8\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.277578 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-nb\") pod \"05899547-935a-47e4-b055-fb03cf46afa8\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.277627 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-config\") pod \"05899547-935a-47e4-b055-fb03cf46afa8\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.277702 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cx2n\" (UniqueName: 
\"kubernetes.io/projected/05899547-935a-47e4-b055-fb03cf46afa8-kube-api-access-8cx2n\") pod \"05899547-935a-47e4-b055-fb03cf46afa8\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.277726 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-dns-svc\") pod \"05899547-935a-47e4-b055-fb03cf46afa8\" (UID: \"05899547-935a-47e4-b055-fb03cf46afa8\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.279563 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "05899547-935a-47e4-b055-fb03cf46afa8" (UID: "05899547-935a-47e4-b055-fb03cf46afa8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.280276 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "05899547-935a-47e4-b055-fb03cf46afa8" (UID: "05899547-935a-47e4-b055-fb03cf46afa8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.280606 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-config" (OuterVolumeSpecName: "config") pod "05899547-935a-47e4-b055-fb03cf46afa8" (UID: "05899547-935a-47e4-b055-fb03cf46afa8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.281859 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05899547-935a-47e4-b055-fb03cf46afa8" (UID: "05899547-935a-47e4-b055-fb03cf46afa8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.296061 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05899547-935a-47e4-b055-fb03cf46afa8-kube-api-access-8cx2n" (OuterVolumeSpecName: "kube-api-access-8cx2n") pod "05899547-935a-47e4-b055-fb03cf46afa8" (UID: "05899547-935a-47e4-b055-fb03cf46afa8"). InnerVolumeSpecName "kube-api-access-8cx2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.382064 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.382086 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.382095 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.382107 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cx2n\" (UniqueName: \"kubernetes.io/projected/05899547-935a-47e4-b055-fb03cf46afa8-kube-api-access-8cx2n\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.382118 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05899547-935a-47e4-b055-fb03cf46afa8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.623452 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9cs4h"] Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.714819 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9a65-account-create-update-ghjsp"] Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.738269 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:37 crc kubenswrapper[4789]: W1216 07:11:37.759184 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf70f2c3_1a4b_44e8_87e7_1d03a302998d.slice/crio-dc8ca5cc8fa85435a91b70bcc5bd369b35224699fc296fa5ac64a9a3760744db WatchSource:0}: Error finding container dc8ca5cc8fa85435a91b70bcc5bd369b35224699fc296fa5ac64a9a3760744db: Status 404 returned error can't find the container with id dc8ca5cc8fa85435a91b70bcc5bd369b35224699fc296fa5ac64a9a3760744db Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.790039 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-additional-scripts\") pod \"6aa9cb74-679f-43d5-818a-3887a7f7987b\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.791108 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27cqf\" (UniqueName: \"kubernetes.io/projected/6aa9cb74-679f-43d5-818a-3887a7f7987b-kube-api-access-27cqf\") pod 
\"6aa9cb74-679f-43d5-818a-3887a7f7987b\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.791014 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6aa9cb74-679f-43d5-818a-3887a7f7987b" (UID: "6aa9cb74-679f-43d5-818a-3887a7f7987b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.791180 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-log-ovn\") pod \"6aa9cb74-679f-43d5-818a-3887a7f7987b\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.791616 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run\") pod \"6aa9cb74-679f-43d5-818a-3887a7f7987b\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.791672 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run-ovn\") pod \"6aa9cb74-679f-43d5-818a-3887a7f7987b\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.791738 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-scripts\") pod \"6aa9cb74-679f-43d5-818a-3887a7f7987b\" (UID: \"6aa9cb74-679f-43d5-818a-3887a7f7987b\") " Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.792103 4789 
reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.793155 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6aa9cb74-679f-43d5-818a-3887a7f7987b" (UID: "6aa9cb74-679f-43d5-818a-3887a7f7987b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.793205 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6aa9cb74-679f-43d5-818a-3887a7f7987b" (UID: "6aa9cb74-679f-43d5-818a-3887a7f7987b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.793250 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run" (OuterVolumeSpecName: "var-run") pod "6aa9cb74-679f-43d5-818a-3887a7f7987b" (UID: "6aa9cb74-679f-43d5-818a-3887a7f7987b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.794657 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-scripts" (OuterVolumeSpecName: "scripts") pod "6aa9cb74-679f-43d5-818a-3887a7f7987b" (UID: "6aa9cb74-679f-43d5-818a-3887a7f7987b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.801721 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa9cb74-679f-43d5-818a-3887a7f7987b-kube-api-access-27cqf" (OuterVolumeSpecName: "kube-api-access-27cqf") pod "6aa9cb74-679f-43d5-818a-3887a7f7987b" (UID: "6aa9cb74-679f-43d5-818a-3887a7f7987b"). InnerVolumeSpecName "kube-api-access-27cqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.847447 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fmwzn"] Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.861756 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-30bf-account-create-update-cgmvn"] Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.869553 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jlc8k"] Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.895436 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aa9cb74-679f-43d5-818a-3887a7f7987b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.895462 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27cqf\" (UniqueName: \"kubernetes.io/projected/6aa9cb74-679f-43d5-818a-3887a7f7987b-kube-api-access-27cqf\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.895474 4789 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.895485 4789 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:37 crc kubenswrapper[4789]: I1216 07:11:37.895494 4789 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6aa9cb74-679f-43d5-818a-3887a7f7987b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.062670 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-6vb5v"] Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.073026 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5298-account-create-update-hrzjv"] Dec 16 07:11:38 crc kubenswrapper[4789]: W1216 07:11:38.084176 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81bd104f_cabd_425e_960d_32a7c8f65d4d.slice/crio-66cbee8e02ca161764a8c175feaa5ba269650cc9922962c1ec547c3e4420705f WatchSource:0}: Error finding container 66cbee8e02ca161764a8c175feaa5ba269650cc9922962c1ec547c3e4420705f: Status 404 returned error can't find the container with id 66cbee8e02ca161764a8c175feaa5ba269650cc9922962c1ec547c3e4420705f Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.192590 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fmwzn" event={"ID":"9740f406-8da5-496b-a8c7-b0c7474fe4da","Type":"ContainerStarted","Data":"da5a6915508d90bfe15082a2fb62067ba26dbbbb68d156e2c71ab4d2a1fc12df"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.195566 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jlc8k" event={"ID":"08d5548c-14fe-416e-86d8-f6845cbcc57c","Type":"ContainerStarted","Data":"ca0339645f5e6c70fcadf3047ff4c8fa233feeb1a390453672fbccbeafb922eb"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.201825 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-9cs4h" event={"ID":"7a8b69c5-5882-42d9-8154-1a39e0b55178","Type":"ContainerStarted","Data":"4d4f28d29d9e45c9d1a534d1b5655094d706f77fa22f2d906a42b362484f9d25"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.201868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9cs4h" event={"ID":"7a8b69c5-5882-42d9-8154-1a39e0b55178","Type":"ContainerStarted","Data":"6655396632c77116eb0f01b02646acaacc9f40127b9149879209ad3ef4697a0c"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.206702 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9-config-h9msv" event={"ID":"6aa9cb74-679f-43d5-818a-3887a7f7987b","Type":"ContainerDied","Data":"09b571b00cab68852983e04f24de5b4e2d037a7d46e31fced9f32b1812988df1"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.206742 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09b571b00cab68852983e04f24de5b4e2d037a7d46e31fced9f32b1812988df1" Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.206775 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cw7z9-config-h9msv" Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.211839 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a65-account-create-update-ghjsp" event={"ID":"cf70f2c3-1a4b-44e8-87e7-1d03a302998d","Type":"ContainerStarted","Data":"ccfba8128cb37296a7ad5efdd726c44e3192d20ed3ebc813659f9a92598d3b42"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.212134 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a65-account-create-update-ghjsp" event={"ID":"cf70f2c3-1a4b-44e8-87e7-1d03a302998d","Type":"ContainerStarted","Data":"dc8ca5cc8fa85435a91b70bcc5bd369b35224699fc296fa5ac64a9a3760744db"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.239825 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5vnz5" event={"ID":"bb595623-26e8-470c-bfa0-565282778cbb","Type":"ContainerStarted","Data":"fa4802d6b1f9b4ab40abb3aee482c4a9ba03e2786e17ce87dee90707888b166f"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.247342 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5vnz5" event={"ID":"bb595623-26e8-470c-bfa0-565282778cbb","Type":"ContainerStarted","Data":"5bfc70e95e71dcb578c02a28a3ef1b6e181e157ebb5fa2ef617fa3806cd05758"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.247356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-30bf-account-create-update-cgmvn" event={"ID":"8a2aa4a5-f152-4361-822c-a114f9b41b49","Type":"ContainerStarted","Data":"ccb53d055151efc5e5d8ff9b52ba0513daafc65717da8339d69f456f78579d90"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.247367 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" event={"ID":"81bd104f-cabd-425e-960d-32a7c8f65d4d","Type":"ContainerStarted","Data":"66cbee8e02ca161764a8c175feaa5ba269650cc9922962c1ec547c3e4420705f"} Dec 16 
07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.247385 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5298-account-create-update-hrzjv" event={"ID":"15382adc-269b-498c-ae42-a5e8a681e386","Type":"ContainerStarted","Data":"4424b77d2e97301a32e81b8dceed27978cded296642ed625c34fb04efc6403b8"} Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.243830 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bfd654465-mvvbr" Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.277785 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-9cs4h" podStartSLOduration=2.277763702 podStartE2EDuration="2.277763702s" podCreationTimestamp="2025-12-16 07:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:11:38.236786573 +0000 UTC m=+1236.498674202" watchObservedRunningTime="2025-12-16 07:11:38.277763702 +0000 UTC m=+1236.539651331" Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.287979 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-9a65-account-create-update-ghjsp" podStartSLOduration=2.2879592 podStartE2EDuration="2.2879592s" podCreationTimestamp="2025-12-16 07:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:11:38.25720358 +0000 UTC m=+1236.519091209" watchObservedRunningTime="2025-12-16 07:11:38.2879592 +0000 UTC m=+1236.549846849" Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.292356 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-5vnz5" podStartSLOduration=2.292340266 podStartE2EDuration="2.292340266s" podCreationTimestamp="2025-12-16 07:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:11:38.282933627 +0000 UTC m=+1236.544821256" watchObservedRunningTime="2025-12-16 07:11:38.292340266 +0000 UTC m=+1236.554227895" Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.371665 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-mvvbr"] Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.379733 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bfd654465-mvvbr"] Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.810371 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-cw7z9-config-h9msv"] Dec 16 07:11:38 crc kubenswrapper[4789]: I1216 07:11:38.819410 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-cw7z9-config-h9msv"] Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.272601 4789 generic.go:334] "Generic (PLEG): container finished" podID="bb595623-26e8-470c-bfa0-565282778cbb" containerID="fa4802d6b1f9b4ab40abb3aee482c4a9ba03e2786e17ce87dee90707888b166f" exitCode=0 Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.272686 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5vnz5" event={"ID":"bb595623-26e8-470c-bfa0-565282778cbb","Type":"ContainerDied","Data":"fa4802d6b1f9b4ab40abb3aee482c4a9ba03e2786e17ce87dee90707888b166f"} Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.297348 4789 generic.go:334] "Generic (PLEG): container finished" podID="8a2aa4a5-f152-4361-822c-a114f9b41b49" containerID="532fd11f65282fc52bc292b7aff8cda54f4fc5a01dcbc82485f8e436746b9749" exitCode=0 Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.297457 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-30bf-account-create-update-cgmvn" 
event={"ID":"8a2aa4a5-f152-4361-822c-a114f9b41b49","Type":"ContainerDied","Data":"532fd11f65282fc52bc292b7aff8cda54f4fc5a01dcbc82485f8e436746b9749"} Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.305509 4789 generic.go:334] "Generic (PLEG): container finished" podID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerID="ec2363db1b2e9737a3936198f633f4a6a9b92a07c065753b24d608fba64be02a" exitCode=0 Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.305585 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" event={"ID":"81bd104f-cabd-425e-960d-32a7c8f65d4d","Type":"ContainerDied","Data":"ec2363db1b2e9737a3936198f633f4a6a9b92a07c065753b24d608fba64be02a"} Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.308667 4789 generic.go:334] "Generic (PLEG): container finished" podID="7a8b69c5-5882-42d9-8154-1a39e0b55178" containerID="4d4f28d29d9e45c9d1a534d1b5655094d706f77fa22f2d906a42b362484f9d25" exitCode=0 Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.308743 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9cs4h" event={"ID":"7a8b69c5-5882-42d9-8154-1a39e0b55178","Type":"ContainerDied","Data":"4d4f28d29d9e45c9d1a534d1b5655094d706f77fa22f2d906a42b362484f9d25"} Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.353740 4789 generic.go:334] "Generic (PLEG): container finished" podID="15382adc-269b-498c-ae42-a5e8a681e386" containerID="5006dc4ca327d0dd04cdaac12e74e1592268701a5255897d3cbe17ad6fe5b2c2" exitCode=0 Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.353843 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5298-account-create-update-hrzjv" event={"ID":"15382adc-269b-498c-ae42-a5e8a681e386","Type":"ContainerDied","Data":"5006dc4ca327d0dd04cdaac12e74e1592268701a5255897d3cbe17ad6fe5b2c2"} Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.376123 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="9740f406-8da5-496b-a8c7-b0c7474fe4da" containerID="38532090bd6f4cace6fd83a68fadf8e82aefac3dad9c257c36af02a4dd1033b5" exitCode=0 Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.376204 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fmwzn" event={"ID":"9740f406-8da5-496b-a8c7-b0c7474fe4da","Type":"ContainerDied","Data":"38532090bd6f4cace6fd83a68fadf8e82aefac3dad9c257c36af02a4dd1033b5"} Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.377994 4789 generic.go:334] "Generic (PLEG): container finished" podID="cf70f2c3-1a4b-44e8-87e7-1d03a302998d" containerID="ccfba8128cb37296a7ad5efdd726c44e3192d20ed3ebc813659f9a92598d3b42" exitCode=0 Dec 16 07:11:39 crc kubenswrapper[4789]: I1216 07:11:39.378043 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a65-account-create-update-ghjsp" event={"ID":"cf70f2c3-1a4b-44e8-87e7-1d03a302998d","Type":"ContainerDied","Data":"ccfba8128cb37296a7ad5efdd726c44e3192d20ed3ebc813659f9a92598d3b42"} Dec 16 07:11:40 crc kubenswrapper[4789]: I1216 07:11:40.115458 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05899547-935a-47e4-b055-fb03cf46afa8" path="/var/lib/kubelet/pods/05899547-935a-47e4-b055-fb03cf46afa8/volumes" Dec 16 07:11:40 crc kubenswrapper[4789]: I1216 07:11:40.116295 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa9cb74-679f-43d5-818a-3887a7f7987b" path="/var/lib/kubelet/pods/6aa9cb74-679f-43d5-818a-3887a7f7987b/volumes" Dec 16 07:11:40 crc kubenswrapper[4789]: I1216 07:11:40.389505 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" event={"ID":"81bd104f-cabd-425e-960d-32a7c8f65d4d","Type":"ContainerStarted","Data":"97d4496cdda1fcdfd06fca5f84fc0d6ebd0f02ae717d8b73550760003cfa5c08"} Dec 16 07:11:40 crc kubenswrapper[4789]: I1216 07:11:40.409169 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" podStartSLOduration=4.409152661 podStartE2EDuration="4.409152661s" podCreationTimestamp="2025-12-16 07:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:11:40.408821173 +0000 UTC m=+1238.670708802" watchObservedRunningTime="2025-12-16 07:11:40.409152661 +0000 UTC m=+1238.671040290" Dec 16 07:11:41 crc kubenswrapper[4789]: I1216 07:11:41.409326 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.829108 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.836578 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.842951 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.854076 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.876588 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.883045 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.925711 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15382adc-269b-498c-ae42-a5e8a681e386-operator-scripts\") pod \"15382adc-269b-498c-ae42-a5e8a681e386\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.925787 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9740f406-8da5-496b-a8c7-b0c7474fe4da-operator-scripts\") pod \"9740f406-8da5-496b-a8c7-b0c7474fe4da\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.925818 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-operator-scripts\") pod \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.925884 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f98c\" (UniqueName: \"kubernetes.io/projected/bb595623-26e8-470c-bfa0-565282778cbb-kube-api-access-4f98c\") pod \"bb595623-26e8-470c-bfa0-565282778cbb\" (UID: \"bb595623-26e8-470c-bfa0-565282778cbb\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.925964 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjxnl\" (UniqueName: \"kubernetes.io/projected/7a8b69c5-5882-42d9-8154-1a39e0b55178-kube-api-access-cjxnl\") pod \"7a8b69c5-5882-42d9-8154-1a39e0b55178\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.925988 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hw6v4\" (UniqueName: \"kubernetes.io/projected/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-kube-api-access-hw6v4\") pod \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\" (UID: \"cf70f2c3-1a4b-44e8-87e7-1d03a302998d\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.926038 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2aa4a5-f152-4361-822c-a114f9b41b49-operator-scripts\") pod \"8a2aa4a5-f152-4361-822c-a114f9b41b49\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.926083 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15382adc-269b-498c-ae42-a5e8a681e386-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15382adc-269b-498c-ae42-a5e8a681e386" (UID: "15382adc-269b-498c-ae42-a5e8a681e386"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.927138 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf645\" (UniqueName: \"kubernetes.io/projected/15382adc-269b-498c-ae42-a5e8a681e386-kube-api-access-bf645\") pod \"15382adc-269b-498c-ae42-a5e8a681e386\" (UID: \"15382adc-269b-498c-ae42-a5e8a681e386\") " Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.928066 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15382adc-269b-498c-ae42-a5e8a681e386-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.929865 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a2aa4a5-f152-4361-822c-a114f9b41b49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a2aa4a5-f152-4361-822c-a114f9b41b49" (UID: "8a2aa4a5-f152-4361-822c-a114f9b41b49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.929892 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9740f406-8da5-496b-a8c7-b0c7474fe4da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9740f406-8da5-496b-a8c7-b0c7474fe4da" (UID: "9740f406-8da5-496b-a8c7-b0c7474fe4da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.929974 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf70f2c3-1a4b-44e8-87e7-1d03a302998d" (UID: "cf70f2c3-1a4b-44e8-87e7-1d03a302998d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.929988 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8b69c5-5882-42d9-8154-1a39e0b55178-kube-api-access-cjxnl" (OuterVolumeSpecName: "kube-api-access-cjxnl") pod "7a8b69c5-5882-42d9-8154-1a39e0b55178" (UID: "7a8b69c5-5882-42d9-8154-1a39e0b55178"). InnerVolumeSpecName "kube-api-access-cjxnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.930383 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb595623-26e8-470c-bfa0-565282778cbb-kube-api-access-4f98c" (OuterVolumeSpecName: "kube-api-access-4f98c") pod "bb595623-26e8-470c-bfa0-565282778cbb" (UID: "bb595623-26e8-470c-bfa0-565282778cbb"). InnerVolumeSpecName "kube-api-access-4f98c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.931074 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15382adc-269b-498c-ae42-a5e8a681e386-kube-api-access-bf645" (OuterVolumeSpecName: "kube-api-access-bf645") pod "15382adc-269b-498c-ae42-a5e8a681e386" (UID: "15382adc-269b-498c-ae42-a5e8a681e386"). InnerVolumeSpecName "kube-api-access-bf645". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:43 crc kubenswrapper[4789]: I1216 07:11:43.932093 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-kube-api-access-hw6v4" (OuterVolumeSpecName: "kube-api-access-hw6v4") pod "cf70f2c3-1a4b-44e8-87e7-1d03a302998d" (UID: "cf70f2c3-1a4b-44e8-87e7-1d03a302998d"). InnerVolumeSpecName "kube-api-access-hw6v4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.028986 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgc5f\" (UniqueName: \"kubernetes.io/projected/8a2aa4a5-f152-4361-822c-a114f9b41b49-kube-api-access-zgc5f\") pod \"8a2aa4a5-f152-4361-822c-a114f9b41b49\" (UID: \"8a2aa4a5-f152-4361-822c-a114f9b41b49\") " Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029105 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8b69c5-5882-42d9-8154-1a39e0b55178-operator-scripts\") pod \"7a8b69c5-5882-42d9-8154-1a39e0b55178\" (UID: \"7a8b69c5-5882-42d9-8154-1a39e0b55178\") " Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029128 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfzgc\" (UniqueName: \"kubernetes.io/projected/9740f406-8da5-496b-a8c7-b0c7474fe4da-kube-api-access-qfzgc\") pod \"9740f406-8da5-496b-a8c7-b0c7474fe4da\" (UID: \"9740f406-8da5-496b-a8c7-b0c7474fe4da\") " Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029145 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb595623-26e8-470c-bfa0-565282778cbb-operator-scripts\") pod \"bb595623-26e8-470c-bfa0-565282778cbb\" (UID: \"bb595623-26e8-470c-bfa0-565282778cbb\") " Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029367 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjxnl\" (UniqueName: \"kubernetes.io/projected/7a8b69c5-5882-42d9-8154-1a39e0b55178-kube-api-access-cjxnl\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029378 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw6v4\" (UniqueName: 
\"kubernetes.io/projected/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-kube-api-access-hw6v4\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029388 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2aa4a5-f152-4361-822c-a114f9b41b49-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029397 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf645\" (UniqueName: \"kubernetes.io/projected/15382adc-269b-498c-ae42-a5e8a681e386-kube-api-access-bf645\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029405 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9740f406-8da5-496b-a8c7-b0c7474fe4da-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029412 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf70f2c3-1a4b-44e8-87e7-1d03a302998d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.029420 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f98c\" (UniqueName: \"kubernetes.io/projected/bb595623-26e8-470c-bfa0-565282778cbb-kube-api-access-4f98c\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.030184 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb595623-26e8-470c-bfa0-565282778cbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb595623-26e8-470c-bfa0-565282778cbb" (UID: "bb595623-26e8-470c-bfa0-565282778cbb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.030188 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8b69c5-5882-42d9-8154-1a39e0b55178-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a8b69c5-5882-42d9-8154-1a39e0b55178" (UID: "7a8b69c5-5882-42d9-8154-1a39e0b55178"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.033889 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2aa4a5-f152-4361-822c-a114f9b41b49-kube-api-access-zgc5f" (OuterVolumeSpecName: "kube-api-access-zgc5f") pod "8a2aa4a5-f152-4361-822c-a114f9b41b49" (UID: "8a2aa4a5-f152-4361-822c-a114f9b41b49"). InnerVolumeSpecName "kube-api-access-zgc5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.035032 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9740f406-8da5-496b-a8c7-b0c7474fe4da-kube-api-access-qfzgc" (OuterVolumeSpecName: "kube-api-access-qfzgc") pod "9740f406-8da5-496b-a8c7-b0c7474fe4da" (UID: "9740f406-8da5-496b-a8c7-b0c7474fe4da"). InnerVolumeSpecName "kube-api-access-qfzgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.130124 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a8b69c5-5882-42d9-8154-1a39e0b55178-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.130460 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfzgc\" (UniqueName: \"kubernetes.io/projected/9740f406-8da5-496b-a8c7-b0c7474fe4da-kube-api-access-qfzgc\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.130475 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb595623-26e8-470c-bfa0-565282778cbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.130485 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgc5f\" (UniqueName: \"kubernetes.io/projected/8a2aa4a5-f152-4361-822c-a114f9b41b49-kube-api-access-zgc5f\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.432931 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9cs4h" event={"ID":"7a8b69c5-5882-42d9-8154-1a39e0b55178","Type":"ContainerDied","Data":"6655396632c77116eb0f01b02646acaacc9f40127b9149879209ad3ef4697a0c"} Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.432973 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9cs4h" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.432983 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6655396632c77116eb0f01b02646acaacc9f40127b9149879209ad3ef4697a0c" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.436330 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5298-account-create-update-hrzjv" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.436369 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5298-account-create-update-hrzjv" event={"ID":"15382adc-269b-498c-ae42-a5e8a681e386","Type":"ContainerDied","Data":"4424b77d2e97301a32e81b8dceed27978cded296642ed625c34fb04efc6403b8"} Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.436437 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4424b77d2e97301a32e81b8dceed27978cded296642ed625c34fb04efc6403b8" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.455109 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fmwzn" event={"ID":"9740f406-8da5-496b-a8c7-b0c7474fe4da","Type":"ContainerDied","Data":"da5a6915508d90bfe15082a2fb62067ba26dbbbb68d156e2c71ab4d2a1fc12df"} Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.455329 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fmwzn" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.455348 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da5a6915508d90bfe15082a2fb62067ba26dbbbb68d156e2c71ab4d2a1fc12df" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.460568 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9a65-account-create-update-ghjsp" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.461415 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9a65-account-create-update-ghjsp" event={"ID":"cf70f2c3-1a4b-44e8-87e7-1d03a302998d","Type":"ContainerDied","Data":"dc8ca5cc8fa85435a91b70bcc5bd369b35224699fc296fa5ac64a9a3760744db"} Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.462160 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc8ca5cc8fa85435a91b70bcc5bd369b35224699fc296fa5ac64a9a3760744db" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.466954 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jlc8k" event={"ID":"08d5548c-14fe-416e-86d8-f6845cbcc57c","Type":"ContainerStarted","Data":"3de1d641f5bc055659878f3fb9702aef4d0f671e418ce7cddb37b9a7b2ceb48b"} Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.473120 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5vnz5" event={"ID":"bb595623-26e8-470c-bfa0-565282778cbb","Type":"ContainerDied","Data":"5bfc70e95e71dcb578c02a28a3ef1b6e181e157ebb5fa2ef617fa3806cd05758"} Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.473162 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bfc70e95e71dcb578c02a28a3ef1b6e181e157ebb5fa2ef617fa3806cd05758" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.473243 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5vnz5" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.478847 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-30bf-account-create-update-cgmvn" event={"ID":"8a2aa4a5-f152-4361-822c-a114f9b41b49","Type":"ContainerDied","Data":"ccb53d055151efc5e5d8ff9b52ba0513daafc65717da8339d69f456f78579d90"} Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.478892 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccb53d055151efc5e5d8ff9b52ba0513daafc65717da8339d69f456f78579d90" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.478984 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-30bf-account-create-update-cgmvn" Dec 16 07:11:44 crc kubenswrapper[4789]: I1216 07:11:44.495751 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jlc8k" podStartSLOduration=2.493378864 podStartE2EDuration="8.495733778s" podCreationTimestamp="2025-12-16 07:11:36 +0000 UTC" firstStartedPulling="2025-12-16 07:11:37.896357828 +0000 UTC m=+1236.158245457" lastFinishedPulling="2025-12-16 07:11:43.898712742 +0000 UTC m=+1242.160600371" observedRunningTime="2025-12-16 07:11:44.49047956 +0000 UTC m=+1242.752367209" watchObservedRunningTime="2025-12-16 07:11:44.495733778 +0000 UTC m=+1242.757621407" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.250131 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.311989 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-v84p9"] Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.312359 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" 
podUID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerName="dnsmasq-dns" containerID="cri-o://88365ed41bbf83111859483b5a0e5bb3068071855ecbc6908269bc8d5048fcce" gracePeriod=10 Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.505305 4789 generic.go:334] "Generic (PLEG): container finished" podID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerID="88365ed41bbf83111859483b5a0e5bb3068071855ecbc6908269bc8d5048fcce" exitCode=0 Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.505386 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" event={"ID":"7766e284-61b1-4146-b6a7-e45e8eb1772d","Type":"ContainerDied","Data":"88365ed41bbf83111859483b5a0e5bb3068071855ecbc6908269bc8d5048fcce"} Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.784363 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.799562 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-nb\") pod \"7766e284-61b1-4146-b6a7-e45e8eb1772d\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.799811 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-sb\") pod \"7766e284-61b1-4146-b6a7-e45e8eb1772d\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.799962 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-645qz\" (UniqueName: \"kubernetes.io/projected/7766e284-61b1-4146-b6a7-e45e8eb1772d-kube-api-access-645qz\") pod \"7766e284-61b1-4146-b6a7-e45e8eb1772d\" (UID: 
\"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.808770 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7766e284-61b1-4146-b6a7-e45e8eb1772d-kube-api-access-645qz" (OuterVolumeSpecName: "kube-api-access-645qz") pod "7766e284-61b1-4146-b6a7-e45e8eb1772d" (UID: "7766e284-61b1-4146-b6a7-e45e8eb1772d"). InnerVolumeSpecName "kube-api-access-645qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.863746 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7766e284-61b1-4146-b6a7-e45e8eb1772d" (UID: "7766e284-61b1-4146-b6a7-e45e8eb1772d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.873341 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7766e284-61b1-4146-b6a7-e45e8eb1772d" (UID: "7766e284-61b1-4146-b6a7-e45e8eb1772d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.901117 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-dns-svc\") pod \"7766e284-61b1-4146-b6a7-e45e8eb1772d\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.901169 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-config\") pod \"7766e284-61b1-4146-b6a7-e45e8eb1772d\" (UID: \"7766e284-61b1-4146-b6a7-e45e8eb1772d\") " Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.901389 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.901402 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.901414 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-645qz\" (UniqueName: \"kubernetes.io/projected/7766e284-61b1-4146-b6a7-e45e8eb1772d-kube-api-access-645qz\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.946704 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7766e284-61b1-4146-b6a7-e45e8eb1772d" (UID: "7766e284-61b1-4146-b6a7-e45e8eb1772d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:47 crc kubenswrapper[4789]: I1216 07:11:47.948210 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-config" (OuterVolumeSpecName: "config") pod "7766e284-61b1-4146-b6a7-e45e8eb1772d" (UID: "7766e284-61b1-4146-b6a7-e45e8eb1772d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.003404 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.003461 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7766e284-61b1-4146-b6a7-e45e8eb1772d-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.515355 4789 generic.go:334] "Generic (PLEG): container finished" podID="08d5548c-14fe-416e-86d8-f6845cbcc57c" containerID="3de1d641f5bc055659878f3fb9702aef4d0f671e418ce7cddb37b9a7b2ceb48b" exitCode=0 Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.515397 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jlc8k" event={"ID":"08d5548c-14fe-416e-86d8-f6845cbcc57c","Type":"ContainerDied","Data":"3de1d641f5bc055659878f3fb9702aef4d0f671e418ce7cddb37b9a7b2ceb48b"} Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.517763 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" event={"ID":"7766e284-61b1-4146-b6a7-e45e8eb1772d","Type":"ContainerDied","Data":"d3fddf36060a0024ff9a939e135d11916914106e132b30c78776fc77c6c474a2"} Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.517814 4789 scope.go:117] "RemoveContainer" 
containerID="88365ed41bbf83111859483b5a0e5bb3068071855ecbc6908269bc8d5048fcce" Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.517827 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-v84p9" Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.550445 4789 scope.go:117] "RemoveContainer" containerID="dd6706f3e45c930b5e7382399b5ad642b49d3db8f65af9441ec6627714a15acb" Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.552143 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-v84p9"] Dec 16 07:11:48 crc kubenswrapper[4789]: I1216 07:11:48.559300 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-v84p9"] Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.821806 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.830954 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-combined-ca-bundle\") pod \"08d5548c-14fe-416e-86d8-f6845cbcc57c\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.831119 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fxvb\" (UniqueName: \"kubernetes.io/projected/08d5548c-14fe-416e-86d8-f6845cbcc57c-kube-api-access-5fxvb\") pod \"08d5548c-14fe-416e-86d8-f6845cbcc57c\" (UID: \"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.831207 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-config-data\") pod \"08d5548c-14fe-416e-86d8-f6845cbcc57c\" (UID: 
\"08d5548c-14fe-416e-86d8-f6845cbcc57c\") " Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.852137 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d5548c-14fe-416e-86d8-f6845cbcc57c-kube-api-access-5fxvb" (OuterVolumeSpecName: "kube-api-access-5fxvb") pod "08d5548c-14fe-416e-86d8-f6845cbcc57c" (UID: "08d5548c-14fe-416e-86d8-f6845cbcc57c"). InnerVolumeSpecName "kube-api-access-5fxvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.871297 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08d5548c-14fe-416e-86d8-f6845cbcc57c" (UID: "08d5548c-14fe-416e-86d8-f6845cbcc57c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.892661 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-config-data" (OuterVolumeSpecName: "config-data") pod "08d5548c-14fe-416e-86d8-f6845cbcc57c" (UID: "08d5548c-14fe-416e-86d8-f6845cbcc57c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.932903 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.932946 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d5548c-14fe-416e-86d8-f6845cbcc57c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:49 crc kubenswrapper[4789]: I1216 07:11:49.932956 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fxvb\" (UniqueName: \"kubernetes.io/projected/08d5548c-14fe-416e-86d8-f6845cbcc57c-kube-api-access-5fxvb\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.114389 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7766e284-61b1-4146-b6a7-e45e8eb1772d" path="/var/lib/kubelet/pods/7766e284-61b1-4146-b6a7-e45e8eb1772d/volumes" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.536370 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jlc8k" event={"ID":"08d5548c-14fe-416e-86d8-f6845cbcc57c","Type":"ContainerDied","Data":"ca0339645f5e6c70fcadf3047ff4c8fa233feeb1a390453672fbccbeafb922eb"} Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.536407 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca0339645f5e6c70fcadf3047ff4c8fa233feeb1a390453672fbccbeafb922eb" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.536457 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jlc8k" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774238 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b792l"] Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774548 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerName="init" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774564 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerName="init" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774575 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf70f2c3-1a4b-44e8-87e7-1d03a302998d" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774582 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf70f2c3-1a4b-44e8-87e7-1d03a302998d" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774591 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb595623-26e8-470c-bfa0-565282778cbb" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774597 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb595623-26e8-470c-bfa0-565282778cbb" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774609 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa9cb74-679f-43d5-818a-3887a7f7987b" containerName="ovn-config" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774614 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa9cb74-679f-43d5-818a-3887a7f7987b" containerName="ovn-config" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774639 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15382adc-269b-498c-ae42-a5e8a681e386" 
containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774645 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="15382adc-269b-498c-ae42-a5e8a681e386" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774653 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d5548c-14fe-416e-86d8-f6845cbcc57c" containerName="keystone-db-sync" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774660 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d5548c-14fe-416e-86d8-f6845cbcc57c" containerName="keystone-db-sync" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774669 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8b69c5-5882-42d9-8154-1a39e0b55178" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774675 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8b69c5-5882-42d9-8154-1a39e0b55178" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774686 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9740f406-8da5-496b-a8c7-b0c7474fe4da" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774691 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9740f406-8da5-496b-a8c7-b0c7474fe4da" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774700 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerName="dnsmasq-dns" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774706 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerName="dnsmasq-dns" Dec 16 07:11:50 crc kubenswrapper[4789]: E1216 07:11:50.774720 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8a2aa4a5-f152-4361-822c-a114f9b41b49" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774726 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2aa4a5-f152-4361-822c-a114f9b41b49" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774866 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb595623-26e8-470c-bfa0-565282778cbb" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774878 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7766e284-61b1-4146-b6a7-e45e8eb1772d" containerName="dnsmasq-dns" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774888 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="15382adc-269b-498c-ae42-a5e8a681e386" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774896 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa9cb74-679f-43d5-818a-3887a7f7987b" containerName="ovn-config" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774904 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9740f406-8da5-496b-a8c7-b0c7474fe4da" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774928 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8b69c5-5882-42d9-8154-1a39e0b55178" containerName="mariadb-database-create" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774940 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d5548c-14fe-416e-86d8-f6845cbcc57c" containerName="keystone-db-sync" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.774964 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2aa4a5-f152-4361-822c-a114f9b41b49" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc 
kubenswrapper[4789]: I1216 07:11:50.774971 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf70f2c3-1a4b-44e8-87e7-1d03a302998d" containerName="mariadb-account-create-update" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.775438 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.778224 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.778462 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2cc6f" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.784310 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.784563 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.784738 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.800970 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-q5tbh"] Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.802584 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.825352 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b792l"] Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.841872 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-q5tbh"] Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845531 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-scripts\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845563 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845597 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-fernet-keys\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845614 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-credential-keys\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc 
kubenswrapper[4789]: I1216 07:11:50.845643 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjn8p\" (UniqueName: \"kubernetes.io/projected/3459a829-6581-4ef6-a90e-342e4f3a138e-kube-api-access-tjn8p\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845676 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-combined-ca-bundle\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845696 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-config\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845722 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845803 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh884\" (UniqueName: \"kubernetes.io/projected/e60ab97a-b1b7-4e70-87ca-efe218d234ce-kube-api-access-zh884\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " 
pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845821 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-config-data\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845839 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.845860 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947033 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-scripts\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947079 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " 
pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947106 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-fernet-keys\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947126 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-credential-keys\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947150 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjn8p\" (UniqueName: \"kubernetes.io/projected/3459a829-6581-4ef6-a90e-342e4f3a138e-kube-api-access-tjn8p\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947177 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-combined-ca-bundle\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947196 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-config\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: 
I1216 07:11:50.947217 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947275 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh884\" (UniqueName: \"kubernetes.io/projected/e60ab97a-b1b7-4e70-87ca-efe218d234ce-kube-api-access-zh884\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947290 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-config-data\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947306 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.947326 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.948291 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-swift-storage-0\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.951991 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-config\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.952722 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-svc\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.953289 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.953829 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.959770 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-combined-ca-bundle\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.960028 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-scripts\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.966439 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-credential-keys\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.966822 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-config-data\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:50 crc kubenswrapper[4789]: I1216 07:11:50.966840 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-fernet-keys\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.035860 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjn8p\" (UniqueName: \"kubernetes.io/projected/3459a829-6581-4ef6-a90e-342e4f3a138e-kube-api-access-tjn8p\") pod \"dnsmasq-dns-5fdbfbc95f-q5tbh\" (UID: 
\"3459a829-6581-4ef6-a90e-342e4f3a138e\") " pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.036465 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh884\" (UniqueName: \"kubernetes.io/projected/e60ab97a-b1b7-4e70-87ca-efe218d234ce-kube-api-access-zh884\") pod \"keystone-bootstrap-b792l\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.105283 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b792l" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.134325 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.171681 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-glqgh"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.183321 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.206284 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.206637 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nn726" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.206703 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.246986 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-glqgh"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.295044 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.297172 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.311830 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.312868 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.331891 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375061 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375118 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfn7\" (UniqueName: \"kubernetes.io/projected/ea93c850-0d3d-42f5-9e00-340ea2398cdd-kube-api-access-qmfn7\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375168 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-scripts\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375208 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-scripts\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375243 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-db-sync-config-data\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375266 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56gtt\" (UniqueName: \"kubernetes.io/projected/5d632824-4eaa-4698-b244-88872be244b8-kube-api-access-56gtt\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375286 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375318 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-config-data\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375339 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375360 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-combined-ca-bundle\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375380 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375406 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-config-data\") pod 
\"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.375431 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d632824-4eaa-4698-b244-88872be244b8-etc-machine-id\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.424420 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-bk864"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.425643 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.445639 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.446748 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.446867 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nr77q" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.457843 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9jvbz"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.458945 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.462113 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-km9xh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.462421 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477591 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-config-data\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477624 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477646 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-config-data\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477664 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-combined-ca-bundle\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477679 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477699 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qc7\" (UniqueName: \"kubernetes.io/projected/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-kube-api-access-b8qc7\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477719 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-config-data\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477741 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d632824-4eaa-4698-b244-88872be244b8-etc-machine-id\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477848 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-db-sync-config-data\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477885 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477932 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfn7\" (UniqueName: \"kubernetes.io/projected/ea93c850-0d3d-42f5-9e00-340ea2398cdd-kube-api-access-qmfn7\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477948 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-logs\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.477979 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-scripts\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478015 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kvjt\" (UniqueName: \"kubernetes.io/projected/4284636e-4d98-4efc-a75a-18eada4a3a8d-kube-api-access-2kvjt\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478033 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-scripts\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " 
pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478060 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-db-sync-config-data\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478083 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478098 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56gtt\" (UniqueName: \"kubernetes.io/projected/5d632824-4eaa-4698-b244-88872be244b8-kube-api-access-56gtt\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478119 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-combined-ca-bundle\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478134 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-scripts\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.478151 
4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-combined-ca-bundle\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.491036 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d632824-4eaa-4698-b244-88872be244b8-etc-machine-id\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.491789 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.492026 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.499634 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.506808 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-config-data\") pod \"ceilometer-0\" 
(UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.509316 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-db-sync-config-data\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.513419 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-combined-ca-bundle\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.514118 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-scripts\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.515418 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.519409 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmfn7\" (UniqueName: \"kubernetes.io/projected/ea93c850-0d3d-42f5-9e00-340ea2398cdd-kube-api-access-qmfn7\") pod \"ceilometer-0\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.521162 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-config-data\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.521609 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-scripts\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.551455 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bk864"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.564749 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56gtt\" (UniqueName: \"kubernetes.io/projected/5d632824-4eaa-4698-b244-88872be244b8-kube-api-access-56gtt\") pod \"cinder-db-sync-glqgh\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579295 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kxd22"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579684 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-combined-ca-bundle\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579721 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-scripts\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " 
pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579740 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-combined-ca-bundle\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579768 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-config-data\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579788 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qc7\" (UniqueName: \"kubernetes.io/projected/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-kube-api-access-b8qc7\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579818 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-db-sync-config-data\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579865 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-logs\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.579940 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kvjt\" (UniqueName: \"kubernetes.io/projected/4284636e-4d98-4efc-a75a-18eada4a3a8d-kube-api-access-2kvjt\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.581197 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.584571 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-logs\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.586151 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.586390 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.586651 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vlvtd" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.591302 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kxd22"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.592117 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-combined-ca-bundle\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.592265 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-config-data\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.594017 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-combined-ca-bundle\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.597673 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-scripts\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.599292 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jvbz"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.601974 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qc7\" (UniqueName: \"kubernetes.io/projected/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-kube-api-access-b8qc7\") pod \"placement-db-sync-bk864\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.609335 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kvjt\" (UniqueName: \"kubernetes.io/projected/4284636e-4d98-4efc-a75a-18eada4a3a8d-kube-api-access-2kvjt\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.611490 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-db-sync-config-data\") pod \"barbican-db-sync-9jvbz\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.632000 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.641075 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-q5tbh"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.649522 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-klxrg"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.651355 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.657163 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-klxrg"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.783821 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.783909 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-config\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.783962 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-config\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.783994 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.784029 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.784067 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-combined-ca-bundle\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.784108 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwbdr\" (UniqueName: \"kubernetes.io/projected/67a83e3d-660c-40f0-893c-e8476053df0c-kube-api-access-wwbdr\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: 
I1216 07:11:51.784140 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.784201 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zbxs\" (UniqueName: \"kubernetes.io/projected/3f721de0-e915-40f9-9444-f3135f39072c-kube-api-access-2zbxs\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.790323 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bk864" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.828997 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-glqgh" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.855179 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.858439 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-q5tbh"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890333 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-config\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890373 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-config\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890402 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890428 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890454 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-combined-ca-bundle\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890479 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwbdr\" (UniqueName: \"kubernetes.io/projected/67a83e3d-660c-40f0-893c-e8476053df0c-kube-api-access-wwbdr\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890518 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890559 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zbxs\" (UniqueName: \"kubernetes.io/projected/3f721de0-e915-40f9-9444-f3135f39072c-kube-api-access-2zbxs\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.890586 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-svc\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.891575 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-svc\") pod 
\"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.892276 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.893546 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-config\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.896626 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.897251 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-combined-ca-bundle\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: W1216 07:11:51.897327 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3459a829_6581_4ef6_a90e_342e4f3a138e.slice/crio-e739fd1bb950ed904aac06445ed552967913e491844084ff549e6516d9cc0b97 WatchSource:0}: Error finding container 
e739fd1bb950ed904aac06445ed552967913e491844084ff549e6516d9cc0b97: Status 404 returned error can't find the container with id e739fd1bb950ed904aac06445ed552967913e491844084ff549e6516d9cc0b97 Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.897939 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.910280 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.912528 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.914848 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-config\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.919962 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.920212 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.920332 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.920937 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-l26rp" Dec 16 07:11:51 crc kubenswrapper[4789]: 
I1216 07:11:51.921220 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwbdr\" (UniqueName: \"kubernetes.io/projected/67a83e3d-660c-40f0-893c-e8476053df0c-kube-api-access-wwbdr\") pod \"dnsmasq-dns-6f6f8cb849-klxrg\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.934435 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.934694 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.944600 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbxs\" (UniqueName: \"kubernetes.io/projected/3f721de0-e915-40f9-9444-f3135f39072c-kube-api-access-2zbxs\") pod \"neutron-db-sync-kxd22\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.959255 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:11:51 crc kubenswrapper[4789]: I1216 07:11:51.983472 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.088613 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b792l"] Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095166 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095247 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-config-data\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095267 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095287 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095317 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-scripts\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095336 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095365 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-logs\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.095408 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl2tq\" (UniqueName: \"kubernetes.io/projected/df63fbb9-0acf-4bfe-9380-63b8dc15928a-kube-api-access-wl2tq\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: W1216 07:11:52.184924 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode60ab97a_b1b7_4e70_87ca_efe218d234ce.slice/crio-1c513860bf8170844abe9592e8aaf209a97fb6d8d6d3178560eea1e038616851 WatchSource:0}: Error finding container 1c513860bf8170844abe9592e8aaf209a97fb6d8d6d3178560eea1e038616851: Status 404 returned error can't find the container with id 1c513860bf8170844abe9592e8aaf209a97fb6d8d6d3178560eea1e038616851 
Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.197750 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl2tq\" (UniqueName: \"kubernetes.io/projected/df63fbb9-0acf-4bfe-9380-63b8dc15928a-kube-api-access-wl2tq\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.197795 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.197859 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-config-data\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.197879 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.197899 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.197962 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-scripts\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.197988 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.198025 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-logs\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.198725 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.199252 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.199532 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-logs\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.202363 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kxd22" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.213466 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-scripts\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.215569 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.217133 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.218865 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.218999 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.222594 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-config-data\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.226232 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.231858 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.261010 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.261235 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl2tq\" (UniqueName: \"kubernetes.io/projected/df63fbb9-0acf-4bfe-9380-63b8dc15928a-kube-api-access-wl2tq\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.285743 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 
07:11:52.287900 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.299717 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-logs\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.299781 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.299888 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.299992 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.300063 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.300226 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.300287 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sc2x\" (UniqueName: \"kubernetes.io/projected/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-kube-api-access-9sc2x\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.300326 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.403643 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404054 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sc2x\" (UniqueName: \"kubernetes.io/projected/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-kube-api-access-9sc2x\") pod 
\"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404085 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404158 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-logs\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404195 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404231 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404272 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404315 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.404808 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-logs\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.405296 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.405616 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.409012 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bk864"] Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.413656 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.419507 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.420696 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.422436 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.426108 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sc2x\" (UniqueName: \"kubernetes.io/projected/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-kube-api-access-9sc2x\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: W1216 07:11:52.430370 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c0faa76_d0ec_471c_bf3a_dfb991f4dfc6.slice/crio-ce942095dd7a42a7e3fc0e4be23674e1224757fed8d512b2399e11ce4c0cf379 WatchSource:0}: Error finding container 
ce942095dd7a42a7e3fc0e4be23674e1224757fed8d512b2399e11ce4c0cf379: Status 404 returned error can't find the container with id ce942095dd7a42a7e3fc0e4be23674e1224757fed8d512b2399e11ce4c0cf379 Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.439241 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.533555 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-glqgh"] Dec 16 07:11:52 crc kubenswrapper[4789]: W1216 07:11:52.534299 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d632824_4eaa_4698_b244_88872be244b8.slice/crio-0fc09c07457664e105fb6e74e7cac1e6c65164b829014cb4ff717e710a3d0a1d WatchSource:0}: Error finding container 0fc09c07457664e105fb6e74e7cac1e6c65164b829014cb4ff717e710a3d0a1d: Status 404 returned error can't find the container with id 0fc09c07457664e105fb6e74e7cac1e6c65164b829014cb4ff717e710a3d0a1d Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.543185 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.552691 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jvbz"] Dec 16 07:11:52 crc kubenswrapper[4789]: W1216 07:11:52.566985 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4284636e_4d98_4efc_a75a_18eada4a3a8d.slice/crio-0910006e2935b138781d36be01f7586d5cf3c5805ab115454c42bf3274b40c75 WatchSource:0}: Error finding container 0910006e2935b138781d36be01f7586d5cf3c5805ab115454c42bf3274b40c75: Status 404 returned error can't find the container with id 0910006e2935b138781d36be01f7586d5cf3c5805ab115454c42bf3274b40c75 Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.576159 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerStarted","Data":"0097c64eb556baa829cec9e4983384719a7a28c6f2473e41cf397c4f23a462ee"} Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.582890 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-glqgh" event={"ID":"5d632824-4eaa-4698-b244-88872be244b8","Type":"ContainerStarted","Data":"0fc09c07457664e105fb6e74e7cac1e6c65164b829014cb4ff717e710a3d0a1d"} Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.589120 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" event={"ID":"3459a829-6581-4ef6-a90e-342e4f3a138e","Type":"ContainerStarted","Data":"e739fd1bb950ed904aac06445ed552967913e491844084ff549e6516d9cc0b97"} Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.590157 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bk864" 
event={"ID":"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6","Type":"ContainerStarted","Data":"ce942095dd7a42a7e3fc0e4be23674e1224757fed8d512b2399e11ce4c0cf379"} Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.590826 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b792l" event={"ID":"e60ab97a-b1b7-4e70-87ca-efe218d234ce","Type":"ContainerStarted","Data":"1c513860bf8170844abe9592e8aaf209a97fb6d8d6d3178560eea1e038616851"} Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.632764 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.701367 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-klxrg"] Dec 16 07:11:52 crc kubenswrapper[4789]: W1216 07:11:52.732282 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67a83e3d_660c_40f0_893c_e8476053df0c.slice/crio-0490f3075ca5794c7e1b9e2d1610d36f9197ee9160e22b8f24a8033881d6ccd2 WatchSource:0}: Error finding container 0490f3075ca5794c7e1b9e2d1610d36f9197ee9160e22b8f24a8033881d6ccd2: Status 404 returned error can't find the container with id 0490f3075ca5794c7e1b9e2d1610d36f9197ee9160e22b8f24a8033881d6ccd2 Dec 16 07:11:52 crc kubenswrapper[4789]: I1216 07:11:52.787397 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kxd22"] Dec 16 07:11:53 crc kubenswrapper[4789]: I1216 07:11:53.342524 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:11:53 crc kubenswrapper[4789]: I1216 07:11:53.363453 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:11:53 crc kubenswrapper[4789]: I1216 07:11:53.409031 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 
07:11:53 crc kubenswrapper[4789]: I1216 07:11:53.598501 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kxd22" event={"ID":"3f721de0-e915-40f9-9444-f3135f39072c","Type":"ContainerStarted","Data":"4d6187a2edab5e9a96506e38b7e727ac56bb99ef20e626298b88f9e06bba0a6d"} Dec 16 07:11:53 crc kubenswrapper[4789]: I1216 07:11:53.600494 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jvbz" event={"ID":"4284636e-4d98-4efc-a75a-18eada4a3a8d","Type":"ContainerStarted","Data":"0910006e2935b138781d36be01f7586d5cf3c5805ab115454c42bf3274b40c75"} Dec 16 07:11:53 crc kubenswrapper[4789]: I1216 07:11:53.601516 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" event={"ID":"67a83e3d-660c-40f0-893c-e8476053df0c","Type":"ContainerStarted","Data":"0490f3075ca5794c7e1b9e2d1610d36f9197ee9160e22b8f24a8033881d6ccd2"} Dec 16 07:11:55 crc kubenswrapper[4789]: I1216 07:11:55.765886 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.425538 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:11:56 crc kubenswrapper[4789]: W1216 07:11:56.430989 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod518a95d6_4a82_499a_9e43_6f1fbbe18c8b.slice/crio-7fbd96872ac10b5115a826f1914c1cb3add275f84d932263d756a4d135d4db33 WatchSource:0}: Error finding container 7fbd96872ac10b5115a826f1914c1cb3add275f84d932263d756a4d135d4db33: Status 404 returned error can't find the container with id 7fbd96872ac10b5115a826f1914c1cb3add275f84d932263d756a4d135d4db33 Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.634950 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"df63fbb9-0acf-4bfe-9380-63b8dc15928a","Type":"ContainerStarted","Data":"55712e4de2f5f6644b3513014db69a11912f5a2b5f3058476a347e27a22ad867"} Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.635345 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df63fbb9-0acf-4bfe-9380-63b8dc15928a","Type":"ContainerStarted","Data":"4f346820d684858bb824eee2ea1c0af048979cb268fd997ec8736069988852c7"} Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.636647 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kxd22" event={"ID":"3f721de0-e915-40f9-9444-f3135f39072c","Type":"ContainerStarted","Data":"c8b668331c6026beabbd783213a89f766a203e33aae00ba9b1d79a7de6730e9f"} Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.642802 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b792l" event={"ID":"e60ab97a-b1b7-4e70-87ca-efe218d234ce","Type":"ContainerStarted","Data":"a4c8b978fe12bb63bc957bf77935159302b7af3ef11194e3a4ad5ee214f0ad07"} Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.645369 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"518a95d6-4a82-499a-9e43-6f1fbbe18c8b","Type":"ContainerStarted","Data":"7fbd96872ac10b5115a826f1914c1cb3add275f84d932263d756a4d135d4db33"} Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.649562 4789 generic.go:334] "Generic (PLEG): container finished" podID="3459a829-6581-4ef6-a90e-342e4f3a138e" containerID="b34291ec5fab3690ca9ab759c69dd255e4f8be9a1fda2177f24de514e3047b1e" exitCode=0 Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.651057 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" event={"ID":"3459a829-6581-4ef6-a90e-342e4f3a138e","Type":"ContainerDied","Data":"b34291ec5fab3690ca9ab759c69dd255e4f8be9a1fda2177f24de514e3047b1e"} Dec 16 07:11:56 crc 
kubenswrapper[4789]: I1216 07:11:56.654107 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kxd22" podStartSLOduration=5.654090138 podStartE2EDuration="5.654090138s" podCreationTimestamp="2025-12-16 07:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:11:56.652494069 +0000 UTC m=+1254.914381698" watchObservedRunningTime="2025-12-16 07:11:56.654090138 +0000 UTC m=+1254.915977777" Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.662731 4789 generic.go:334] "Generic (PLEG): container finished" podID="67a83e3d-660c-40f0-893c-e8476053df0c" containerID="a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8" exitCode=0 Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.662782 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" event={"ID":"67a83e3d-660c-40f0-893c-e8476053df0c","Type":"ContainerDied","Data":"a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8"} Dec 16 07:11:56 crc kubenswrapper[4789]: I1216 07:11:56.699059 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b792l" podStartSLOduration=6.699038263 podStartE2EDuration="6.699038263s" podCreationTimestamp="2025-12-16 07:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:11:56.693631561 +0000 UTC m=+1254.955519210" watchObservedRunningTime="2025-12-16 07:11:56.699038263 +0000 UTC m=+1254.960925892" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.130069 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.185009 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-sb\") pod \"3459a829-6581-4ef6-a90e-342e4f3a138e\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.185308 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-swift-storage-0\") pod \"3459a829-6581-4ef6-a90e-342e4f3a138e\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.185392 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjn8p\" (UniqueName: \"kubernetes.io/projected/3459a829-6581-4ef6-a90e-342e4f3a138e-kube-api-access-tjn8p\") pod \"3459a829-6581-4ef6-a90e-342e4f3a138e\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.185408 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-config\") pod \"3459a829-6581-4ef6-a90e-342e4f3a138e\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.185503 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-svc\") pod \"3459a829-6581-4ef6-a90e-342e4f3a138e\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.185531 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-nb\") pod \"3459a829-6581-4ef6-a90e-342e4f3a138e\" (UID: \"3459a829-6581-4ef6-a90e-342e4f3a138e\") " Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.217475 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3459a829-6581-4ef6-a90e-342e4f3a138e-kube-api-access-tjn8p" (OuterVolumeSpecName: "kube-api-access-tjn8p") pod "3459a829-6581-4ef6-a90e-342e4f3a138e" (UID: "3459a829-6581-4ef6-a90e-342e4f3a138e"). InnerVolumeSpecName "kube-api-access-tjn8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.292684 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjn8p\" (UniqueName: \"kubernetes.io/projected/3459a829-6581-4ef6-a90e-342e4f3a138e-kube-api-access-tjn8p\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.330784 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-config" (OuterVolumeSpecName: "config") pod "3459a829-6581-4ef6-a90e-342e4f3a138e" (UID: "3459a829-6581-4ef6-a90e-342e4f3a138e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.345146 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3459a829-6581-4ef6-a90e-342e4f3a138e" (UID: "3459a829-6581-4ef6-a90e-342e4f3a138e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.374024 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3459a829-6581-4ef6-a90e-342e4f3a138e" (UID: "3459a829-6581-4ef6-a90e-342e4f3a138e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.394347 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.394843 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.394862 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.394884 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3459a829-6581-4ef6-a90e-342e4f3a138e" (UID: "3459a829-6581-4ef6-a90e-342e4f3a138e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.398141 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3459a829-6581-4ef6-a90e-342e4f3a138e" (UID: "3459a829-6581-4ef6-a90e-342e4f3a138e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.496617 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.497006 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3459a829-6581-4ef6-a90e-342e4f3a138e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.676056 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" event={"ID":"3459a829-6581-4ef6-a90e-342e4f3a138e","Type":"ContainerDied","Data":"e739fd1bb950ed904aac06445ed552967913e491844084ff549e6516d9cc0b97"} Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.676114 4789 scope.go:117] "RemoveContainer" containerID="b34291ec5fab3690ca9ab759c69dd255e4f8be9a1fda2177f24de514e3047b1e" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.676163 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdbfbc95f-q5tbh" Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.749223 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-q5tbh"] Dec 16 07:11:57 crc kubenswrapper[4789]: I1216 07:11:57.755608 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdbfbc95f-q5tbh"] Dec 16 07:11:58 crc kubenswrapper[4789]: I1216 07:11:58.115943 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3459a829-6581-4ef6-a90e-342e4f3a138e" path="/var/lib/kubelet/pods/3459a829-6581-4ef6-a90e-342e4f3a138e/volumes" Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.737484 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"518a95d6-4a82-499a-9e43-6f1fbbe18c8b","Type":"ContainerStarted","Data":"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215"} Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.739849 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" event={"ID":"67a83e3d-660c-40f0-893c-e8476053df0c","Type":"ContainerStarted","Data":"a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778"} Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.739940 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.750504 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df63fbb9-0acf-4bfe-9380-63b8dc15928a","Type":"ContainerStarted","Data":"ba4d342fe0c4bf0e70b5de42e24ce31de188606fb237399574fd28d7399e2ad6"} Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.750593 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-log" containerID="cri-o://55712e4de2f5f6644b3513014db69a11912f5a2b5f3058476a347e27a22ad867" gracePeriod=30 Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.750648 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-httpd" containerID="cri-o://ba4d342fe0c4bf0e70b5de42e24ce31de188606fb237399574fd28d7399e2ad6" gracePeriod=30 Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.758787 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" podStartSLOduration=11.758764725 podStartE2EDuration="11.758764725s" podCreationTimestamp="2025-12-16 07:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:02.758020857 +0000 UTC m=+1261.019908496" watchObservedRunningTime="2025-12-16 07:12:02.758764725 +0000 UTC m=+1261.020652354" Dec 16 07:12:02 crc kubenswrapper[4789]: I1216 07:12:02.786751 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.786716186 podStartE2EDuration="12.786716186s" podCreationTimestamp="2025-12-16 07:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:02.782125744 +0000 UTC m=+1261.044013393" watchObservedRunningTime="2025-12-16 07:12:02.786716186 +0000 UTC m=+1261.048603825" Dec 16 07:12:03 crc kubenswrapper[4789]: I1216 07:12:03.760443 4789 generic.go:334] "Generic (PLEG): container finished" podID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerID="55712e4de2f5f6644b3513014db69a11912f5a2b5f3058476a347e27a22ad867" exitCode=143 Dec 16 07:12:03 crc kubenswrapper[4789]: I1216 
07:12:03.760514 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df63fbb9-0acf-4bfe-9380-63b8dc15928a","Type":"ContainerDied","Data":"55712e4de2f5f6644b3513014db69a11912f5a2b5f3058476a347e27a22ad867"} Dec 16 07:12:04 crc kubenswrapper[4789]: I1216 07:12:04.772292 4789 generic.go:334] "Generic (PLEG): container finished" podID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerID="ba4d342fe0c4bf0e70b5de42e24ce31de188606fb237399574fd28d7399e2ad6" exitCode=0 Dec 16 07:12:04 crc kubenswrapper[4789]: I1216 07:12:04.772337 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df63fbb9-0acf-4bfe-9380-63b8dc15928a","Type":"ContainerDied","Data":"ba4d342fe0c4bf0e70b5de42e24ce31de188606fb237399574fd28d7399e2ad6"} Dec 16 07:12:06 crc kubenswrapper[4789]: I1216 07:12:06.792320 4789 generic.go:334] "Generic (PLEG): container finished" podID="e60ab97a-b1b7-4e70-87ca-efe218d234ce" containerID="a4c8b978fe12bb63bc957bf77935159302b7af3ef11194e3a4ad5ee214f0ad07" exitCode=0 Dec 16 07:12:06 crc kubenswrapper[4789]: I1216 07:12:06.792402 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b792l" event={"ID":"e60ab97a-b1b7-4e70-87ca-efe218d234ce","Type":"ContainerDied","Data":"a4c8b978fe12bb63bc957bf77935159302b7af3ef11194e3a4ad5ee214f0ad07"} Dec 16 07:12:06 crc kubenswrapper[4789]: I1216 07:12:06.987984 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:12:07 crc kubenswrapper[4789]: I1216 07:12:07.057596 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-6vb5v"] Dec 16 07:12:07 crc kubenswrapper[4789]: I1216 07:12:07.058004 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" 
containerName="dnsmasq-dns" containerID="cri-o://97d4496cdda1fcdfd06fca5f84fc0d6ebd0f02ae717d8b73550760003cfa5c08" gracePeriod=10 Dec 16 07:12:07 crc kubenswrapper[4789]: I1216 07:12:07.249205 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Dec 16 07:12:07 crc kubenswrapper[4789]: I1216 07:12:07.807492 4789 generic.go:334] "Generic (PLEG): container finished" podID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerID="97d4496cdda1fcdfd06fca5f84fc0d6ebd0f02ae717d8b73550760003cfa5c08" exitCode=0 Dec 16 07:12:07 crc kubenswrapper[4789]: I1216 07:12:07.807699 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" event={"ID":"81bd104f-cabd-425e-960d-32a7c8f65d4d","Type":"ContainerDied","Data":"97d4496cdda1fcdfd06fca5f84fc0d6ebd0f02ae717d8b73550760003cfa5c08"} Dec 16 07:12:12 crc kubenswrapper[4789]: I1216 07:12:12.249136 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Dec 16 07:12:17 crc kubenswrapper[4789]: I1216 07:12:17.249392 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Dec 16 07:12:17 crc kubenswrapper[4789]: I1216 07:12:17.250347 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:12:17 crc kubenswrapper[4789]: E1216 07:12:17.529946 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b" Dec 16 07:12:17 crc kubenswrapper[4789]: E1216 07:12:17.530114 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8qc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capa
bilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-bk864_openstack(5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:12:17 crc kubenswrapper[4789]: E1216 07:12:17.531325 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-bk864" podUID="5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" Dec 16 07:12:17 crc kubenswrapper[4789]: E1216 07:12:17.892842 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b\\\"\"" pod="openstack/placement-db-sync-bk864" podUID="5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" Dec 16 07:12:20 crc kubenswrapper[4789]: E1216 07:12:20.817254 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Dec 16 07:12:20 crc kubenswrapper[4789]: E1216 07:12:20.817862 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56gtt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-glqgh_openstack(5d632824-4eaa-4698-b244-88872be244b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:12:20 crc kubenswrapper[4789]: E1216 07:12:20.819546 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-glqgh" podUID="5d632824-4eaa-4698-b244-88872be244b8" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.839812 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b792l" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.858276 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.916689 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.916725 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df63fbb9-0acf-4bfe-9380-63b8dc15928a","Type":"ContainerDied","Data":"4f346820d684858bb824eee2ea1c0af048979cb268fd997ec8736069988852c7"} Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.916788 4789 scope.go:117] "RemoveContainer" containerID="ba4d342fe0c4bf0e70b5de42e24ce31de188606fb237399574fd28d7399e2ad6" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.918996 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b792l" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.919256 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b792l" event={"ID":"e60ab97a-b1b7-4e70-87ca-efe218d234ce","Type":"ContainerDied","Data":"1c513860bf8170844abe9592e8aaf209a97fb6d8d6d3178560eea1e038616851"} Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.919289 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c513860bf8170844abe9592e8aaf209a97fb6d8d6d3178560eea1e038616851" Dec 16 07:12:20 crc kubenswrapper[4789]: E1216 07:12:20.920222 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-glqgh" podUID="5d632824-4eaa-4698-b244-88872be244b8" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968518 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-credential-keys\") pod 
\"e60ab97a-b1b7-4e70-87ca-efe218d234ce\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968587 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-logs\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968632 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-scripts\") pod \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968664 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968689 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl2tq\" (UniqueName: \"kubernetes.io/projected/df63fbb9-0acf-4bfe-9380-63b8dc15928a-kube-api-access-wl2tq\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968743 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-httpd-run\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968800 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh884\" (UniqueName: 
\"kubernetes.io/projected/e60ab97a-b1b7-4e70-87ca-efe218d234ce-kube-api-access-zh884\") pod \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968836 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-fernet-keys\") pod \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968855 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-combined-ca-bundle\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.968901 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-public-tls-certs\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.969007 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-config-data\") pod \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.969043 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-config-data\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.969087 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-combined-ca-bundle\") pod \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\" (UID: \"e60ab97a-b1b7-4e70-87ca-efe218d234ce\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.969155 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-scripts\") pod \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\" (UID: \"df63fbb9-0acf-4bfe-9380-63b8dc15928a\") " Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.969417 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.969621 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.969649 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-logs" (OuterVolumeSpecName: "logs") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.977221 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-scripts" (OuterVolumeSpecName: "scripts") pod "e60ab97a-b1b7-4e70-87ca-efe218d234ce" (UID: "e60ab97a-b1b7-4e70-87ca-efe218d234ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.977244 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60ab97a-b1b7-4e70-87ca-efe218d234ce-kube-api-access-zh884" (OuterVolumeSpecName: "kube-api-access-zh884") pod "e60ab97a-b1b7-4e70-87ca-efe218d234ce" (UID: "e60ab97a-b1b7-4e70-87ca-efe218d234ce"). InnerVolumeSpecName "kube-api-access-zh884". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.977269 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df63fbb9-0acf-4bfe-9380-63b8dc15928a-kube-api-access-wl2tq" (OuterVolumeSpecName: "kube-api-access-wl2tq") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "kube-api-access-wl2tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.978266 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e60ab97a-b1b7-4e70-87ca-efe218d234ce" (UID: "e60ab97a-b1b7-4e70-87ca-efe218d234ce"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.978815 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.991858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-scripts" (OuterVolumeSpecName: "scripts") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.993950 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e60ab97a-b1b7-4e70-87ca-efe218d234ce" (UID: "e60ab97a-b1b7-4e70-87ca-efe218d234ce"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:20 crc kubenswrapper[4789]: I1216 07:12:20.999582 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-config-data" (OuterVolumeSpecName: "config-data") pod "e60ab97a-b1b7-4e70-87ca-efe218d234ce" (UID: "e60ab97a-b1b7-4e70-87ca-efe218d234ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.008247 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.018312 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e60ab97a-b1b7-4e70-87ca-efe218d234ce" (UID: "e60ab97a-b1b7-4e70-87ca-efe218d234ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.020104 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-config-data" (OuterVolumeSpecName: "config-data") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.033670 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "df63fbb9-0acf-4bfe-9380-63b8dc15928a" (UID: "df63fbb9-0acf-4bfe-9380-63b8dc15928a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072301 4789 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072343 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072358 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072371 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072382 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072391 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072403 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df63fbb9-0acf-4bfe-9380-63b8dc15928a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072413 4789 reconciler_common.go:293] "Volume 
detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072424 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df63fbb9-0acf-4bfe-9380-63b8dc15928a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072433 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60ab97a-b1b7-4e70-87ca-efe218d234ce-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072443 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl2tq\" (UniqueName: \"kubernetes.io/projected/df63fbb9-0acf-4bfe-9380-63b8dc15928a-kube-api-access-wl2tq\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072473 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.072484 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh884\" (UniqueName: \"kubernetes.io/projected/e60ab97a-b1b7-4e70-87ca-efe218d234ce-kube-api-access-zh884\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.092771 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.173498 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: 
E1216 07:12:21.227842 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Dec 16 07:12:21 crc kubenswrapper[4789]: E1216 07:12:21.228030 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n676hbdh6ch56fh58h5bh58h86h94h5b5h654h585h54dh65bh586h58bh575h5h687h54h5bbh68h558h58bh598h556h569hb4h694h67bh5c6h69q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmfn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/b
in/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ea93c850-0d3d-42f5-9e00-340ea2398cdd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.264154 4789 scope.go:117] "RemoveContainer" containerID="55712e4de2f5f6644b3513014db69a11912f5a2b5f3058476a347e27a22ad867" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.276977 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.295004 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.305269 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326129 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:12:21 crc kubenswrapper[4789]: E1216 07:12:21.326530 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="dnsmasq-dns" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326555 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="dnsmasq-dns" Dec 16 07:12:21 crc kubenswrapper[4789]: E1216 07:12:21.326575 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3459a829-6581-4ef6-a90e-342e4f3a138e" containerName="init" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326584 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3459a829-6581-4ef6-a90e-342e4f3a138e" containerName="init" Dec 16 07:12:21 crc kubenswrapper[4789]: E1216 07:12:21.326596 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="init" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326603 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="init" Dec 16 07:12:21 crc kubenswrapper[4789]: E1216 07:12:21.326626 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60ab97a-b1b7-4e70-87ca-efe218d234ce" containerName="keystone-bootstrap" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326634 4789 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e60ab97a-b1b7-4e70-87ca-efe218d234ce" containerName="keystone-bootstrap" Dec 16 07:12:21 crc kubenswrapper[4789]: E1216 07:12:21.326645 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-log" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326652 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-log" Dec 16 07:12:21 crc kubenswrapper[4789]: E1216 07:12:21.326662 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-httpd" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326670 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-httpd" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326872 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3459a829-6581-4ef6-a90e-342e4f3a138e" containerName="init" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326886 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e60ab97a-b1b7-4e70-87ca-efe218d234ce" containerName="keystone-bootstrap" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326898 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-httpd" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326928 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" containerName="dnsmasq-dns" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.326943 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" containerName="glance-log" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.328391 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.330893 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.330966 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.344445 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.376151 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-config\") pod \"81bd104f-cabd-425e-960d-32a7c8f65d4d\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.376260 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v859p\" (UniqueName: \"kubernetes.io/projected/81bd104f-cabd-425e-960d-32a7c8f65d4d-kube-api-access-v859p\") pod \"81bd104f-cabd-425e-960d-32a7c8f65d4d\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.376344 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-swift-storage-0\") pod \"81bd104f-cabd-425e-960d-32a7c8f65d4d\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.376370 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-sb\") pod \"81bd104f-cabd-425e-960d-32a7c8f65d4d\" (UID: 
\"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.376442 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-svc\") pod \"81bd104f-cabd-425e-960d-32a7c8f65d4d\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.376464 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-nb\") pod \"81bd104f-cabd-425e-960d-32a7c8f65d4d\" (UID: \"81bd104f-cabd-425e-960d-32a7c8f65d4d\") " Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.382691 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81bd104f-cabd-425e-960d-32a7c8f65d4d-kube-api-access-v859p" (OuterVolumeSpecName: "kube-api-access-v859p") pod "81bd104f-cabd-425e-960d-32a7c8f65d4d" (UID: "81bd104f-cabd-425e-960d-32a7c8f65d4d"). InnerVolumeSpecName "kube-api-access-v859p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.470723 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81bd104f-cabd-425e-960d-32a7c8f65d4d" (UID: "81bd104f-cabd-425e-960d-32a7c8f65d4d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479597 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479702 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-logs\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479736 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479760 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb64k\" (UniqueName: \"kubernetes.io/projected/bb49c501-7035-44f5-8db1-b88552df2500-kube-api-access-bb64k\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479786 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479806 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479829 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479843 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479881 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v859p\" (UniqueName: \"kubernetes.io/projected/81bd104f-cabd-425e-960d-32a7c8f65d4d-kube-api-access-v859p\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.479891 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.495478 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-config" (OuterVolumeSpecName: "config") pod "81bd104f-cabd-425e-960d-32a7c8f65d4d" (UID: "81bd104f-cabd-425e-960d-32a7c8f65d4d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.513561 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81bd104f-cabd-425e-960d-32a7c8f65d4d" (UID: "81bd104f-cabd-425e-960d-32a7c8f65d4d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.565401 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "81bd104f-cabd-425e-960d-32a7c8f65d4d" (UID: "81bd104f-cabd-425e-960d-32a7c8f65d4d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.567274 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81bd104f-cabd-425e-960d-32a7c8f65d4d" (UID: "81bd104f-cabd-425e-960d-32a7c8f65d4d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.584870 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.584950 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.584972 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585014 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585105 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-logs\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 
07:12:21.585148 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585189 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb64k\" (UniqueName: \"kubernetes.io/projected/bb49c501-7035-44f5-8db1-b88552df2500-kube-api-access-bb64k\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585227 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585287 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585303 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585316 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.585329 
4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81bd104f-cabd-425e-960d-32a7c8f65d4d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.591279 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.591335 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.591554 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-logs\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.603930 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.608770 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.608885 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.612632 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.647609 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb64k\" (UniqueName: \"kubernetes.io/projected/bb49c501-7035-44f5-8db1-b88552df2500-kube-api-access-bb64k\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.691071 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.918990 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b792l"] Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.929090 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b792l"] Dec 16 07:12:21 crc 
kubenswrapper[4789]: I1216 07:12:21.929550 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.929589 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.929627 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.930349 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9c7a67d0b05df89259805e04a44c28da359f8954db5a37cfc842fbdb4aa2e7a"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.930432 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://a9c7a67d0b05df89259805e04a44c28da359f8954db5a37cfc842fbdb4aa2e7a" gracePeriod=600 Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.933988 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.934017 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dfc89d77-6vb5v" event={"ID":"81bd104f-cabd-425e-960d-32a7c8f65d4d","Type":"ContainerDied","Data":"66cbee8e02ca161764a8c175feaa5ba269650cc9922962c1ec547c3e4420705f"} Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.934083 4789 scope.go:117] "RemoveContainer" containerID="97d4496cdda1fcdfd06fca5f84fc0d6ebd0f02ae717d8b73550760003cfa5c08" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.942503 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jvbz" event={"ID":"4284636e-4d98-4efc-a75a-18eada4a3a8d","Type":"ContainerStarted","Data":"f41be4360c65f86b476a62a1224e82bdbdfd4163db8a0afd3bb6ffc86b640b24"} Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.944860 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"518a95d6-4a82-499a-9e43-6f1fbbe18c8b","Type":"ContainerStarted","Data":"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40"} Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.945008 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-log" containerID="cri-o://7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215" gracePeriod=30 Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.945232 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-httpd" containerID="cri-o://632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40" gracePeriod=30 Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.946862 4789 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.960804 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9jvbz" podStartSLOduration=2.269343121 podStartE2EDuration="30.96077971s" podCreationTimestamp="2025-12-16 07:11:51 +0000 UTC" firstStartedPulling="2025-12-16 07:11:52.572704748 +0000 UTC m=+1250.834592377" lastFinishedPulling="2025-12-16 07:12:21.264141337 +0000 UTC m=+1279.526028966" observedRunningTime="2025-12-16 07:12:21.957327596 +0000 UTC m=+1280.219215225" watchObservedRunningTime="2025-12-16 07:12:21.96077971 +0000 UTC m=+1280.222667339" Dec 16 07:12:21 crc kubenswrapper[4789]: I1216 07:12:21.961746 4789 scope.go:117] "RemoveContainer" containerID="ec2363db1b2e9737a3936198f633f4a6a9b92a07c065753b24d608fba64be02a" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.019824 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.019801238 podStartE2EDuration="31.019801238s" podCreationTimestamp="2025-12-16 07:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:21.980047879 +0000 UTC m=+1280.241935538" watchObservedRunningTime="2025-12-16 07:12:22.019801238 +0000 UTC m=+1280.281688867" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.057025 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-6vb5v"] Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.090661 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dfc89d77-6vb5v"] Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.123123 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81bd104f-cabd-425e-960d-32a7c8f65d4d" 
path="/var/lib/kubelet/pods/81bd104f-cabd-425e-960d-32a7c8f65d4d/volumes" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.123764 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df63fbb9-0acf-4bfe-9380-63b8dc15928a" path="/var/lib/kubelet/pods/df63fbb9-0acf-4bfe-9380-63b8dc15928a/volumes" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.124572 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e60ab97a-b1b7-4e70-87ca-efe218d234ce" path="/var/lib/kubelet/pods/e60ab97a-b1b7-4e70-87ca-efe218d234ce/volumes" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.125694 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vm9j2"] Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.126852 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vm9j2"] Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.129011 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.131101 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.132160 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.132302 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.132470 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2cc6f" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.132577 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.199846 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-combined-ca-bundle\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.200239 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww8fq\" (UniqueName: \"kubernetes.io/projected/deca8e15-b233-4a6c-bc1e-06494fca64bb-kube-api-access-ww8fq\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.200619 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-fernet-keys\") pod \"keystone-bootstrap-vm9j2\" 
(UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.200708 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-config-data\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.201132 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-scripts\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.201206 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-credential-keys\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.303715 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-fernet-keys\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.303760 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-config-data\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " 
pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.303791 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-scripts\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.303814 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-credential-keys\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.303866 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-combined-ca-bundle\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.303900 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww8fq\" (UniqueName: \"kubernetes.io/projected/deca8e15-b233-4a6c-bc1e-06494fca64bb-kube-api-access-ww8fq\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.310516 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-fernet-keys\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.310537 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-config-data\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.312366 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-credential-keys\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.323178 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-scripts\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.323407 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-combined-ca-bundle\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.326248 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww8fq\" (UniqueName: \"kubernetes.io/projected/deca8e15-b233-4a6c-bc1e-06494fca64bb-kube-api-access-ww8fq\") pod \"keystone-bootstrap-vm9j2\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.572258 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.579058 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.710334 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-config-data\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.710409 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-combined-ca-bundle\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.710453 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-scripts\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.710498 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sc2x\" (UniqueName: \"kubernetes.io/projected/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-kube-api-access-9sc2x\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.710535 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc 
kubenswrapper[4789]: I1216 07:12:22.710557 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-internal-tls-certs\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.710641 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-httpd-run\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.710672 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-logs\") pod \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\" (UID: \"518a95d6-4a82-499a-9e43-6f1fbbe18c8b\") " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.712981 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-logs" (OuterVolumeSpecName: "logs") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.713731 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.719739 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.722479 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-scripts" (OuterVolumeSpecName: "scripts") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.738688 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-kube-api-access-9sc2x" (OuterVolumeSpecName: "kube-api-access-9sc2x") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "kube-api-access-9sc2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.764670 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.769528 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.784378 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-config-data" (OuterVolumeSpecName: "config-data") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.814217 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.814247 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.814260 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.814275 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sc2x\" (UniqueName: \"kubernetes.io/projected/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-kube-api-access-9sc2x\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.814307 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.814322 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.814334 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.817737 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "518a95d6-4a82-499a-9e43-6f1fbbe18c8b" (UID: "518a95d6-4a82-499a-9e43-6f1fbbe18c8b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.840828 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.916291 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.916783 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/518a95d6-4a82-499a-9e43-6f1fbbe18c8b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.959265 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb49c501-7035-44f5-8db1-b88552df2500","Type":"ContainerStarted","Data":"da9c800d0fad1846bc7f03eb8cb6b0f6d90f7f01539b349d36fff38674a8a963"} Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.962854 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="a9c7a67d0b05df89259805e04a44c28da359f8954db5a37cfc842fbdb4aa2e7a" exitCode=0 Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.962957 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"a9c7a67d0b05df89259805e04a44c28da359f8954db5a37cfc842fbdb4aa2e7a"} Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.962986 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"c8393873a978af7e8e2aad1167caa21ec29d5fd3e46fb65f45bf1708f741ab20"} Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.963008 4789 scope.go:117] "RemoveContainer" containerID="b5498247db061c67566479b4544d243bb1272801b3a301b0847cb7fdd1e323de" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.989390 4789 generic.go:334] "Generic (PLEG): container finished" podID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerID="632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40" exitCode=143 Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.989421 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.989427 4789 generic.go:334] "Generic (PLEG): container finished" podID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerID="7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215" exitCode=143 Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.989456 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"518a95d6-4a82-499a-9e43-6f1fbbe18c8b","Type":"ContainerDied","Data":"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40"} Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.989501 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"518a95d6-4a82-499a-9e43-6f1fbbe18c8b","Type":"ContainerDied","Data":"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215"} Dec 16 07:12:22 crc kubenswrapper[4789]: I1216 07:12:22.989513 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"518a95d6-4a82-499a-9e43-6f1fbbe18c8b","Type":"ContainerDied","Data":"7fbd96872ac10b5115a826f1914c1cb3add275f84d932263d756a4d135d4db33"} Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.039657 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.054191 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.071742 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:12:23 crc kubenswrapper[4789]: E1216 07:12:23.072110 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-httpd" Dec 16 07:12:23 crc 
kubenswrapper[4789]: I1216 07:12:23.072127 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-httpd" Dec 16 07:12:23 crc kubenswrapper[4789]: E1216 07:12:23.072142 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-log" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.072148 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-log" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.072440 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-log" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.072461 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" containerName="glance-httpd" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.073472 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.083623 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.100118 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.100372 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.187942 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vm9j2"] Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.222207 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52z2f\" (UniqueName: \"kubernetes.io/projected/b0633f9f-8819-4d01-8925-3c09e214c5f3-kube-api-access-52z2f\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.222298 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.222326 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 
07:12:23.222346 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.222382 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.222428 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.222450 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.222506 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc 
kubenswrapper[4789]: I1216 07:12:23.324222 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52z2f\" (UniqueName: \"kubernetes.io/projected/b0633f9f-8819-4d01-8925-3c09e214c5f3-kube-api-access-52z2f\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324291 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324320 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324338 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324378 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324428 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324461 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324496 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324741 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.324865 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.325615 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.332841 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.333339 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.334470 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.338594 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.343772 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52z2f\" (UniqueName: 
\"kubernetes.io/projected/b0633f9f-8819-4d01-8925-3c09e214c5f3-kube-api-access-52z2f\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.361503 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:12:23 crc kubenswrapper[4789]: I1216 07:12:23.412954 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:24 crc kubenswrapper[4789]: I1216 07:12:24.117228 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="518a95d6-4a82-499a-9e43-6f1fbbe18c8b" path="/var/lib/kubelet/pods/518a95d6-4a82-499a-9e43-6f1fbbe18c8b/volumes" Dec 16 07:12:24 crc kubenswrapper[4789]: W1216 07:12:24.830529 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeca8e15_b233_4a6c_bc1e_06494fca64bb.slice/crio-44dff939d888cac0ff4996300d6f866e5aa913e51968dc051c5bff40fdf15066 WatchSource:0}: Error finding container 44dff939d888cac0ff4996300d6f866e5aa913e51968dc051c5bff40fdf15066: Status 404 returned error can't find the container with id 44dff939d888cac0ff4996300d6f866e5aa913e51968dc051c5bff40fdf15066 Dec 16 07:12:24 crc kubenswrapper[4789]: I1216 07:12:24.986538 4789 scope.go:117] "RemoveContainer" containerID="632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.013836 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm9j2" 
event={"ID":"deca8e15-b233-4a6c-bc1e-06494fca64bb","Type":"ContainerStarted","Data":"44dff939d888cac0ff4996300d6f866e5aa913e51968dc051c5bff40fdf15066"} Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.015662 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb49c501-7035-44f5-8db1-b88552df2500","Type":"ContainerStarted","Data":"3499b6119f9e2351dc1c72dc30e3b362986e2dc30e9b09646c3b984e0c3ed2a8"} Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.059200 4789 scope.go:117] "RemoveContainer" containerID="7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.088736 4789 scope.go:117] "RemoveContainer" containerID="632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40" Dec 16 07:12:25 crc kubenswrapper[4789]: E1216 07:12:25.090944 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40\": container with ID starting with 632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40 not found: ID does not exist" containerID="632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.091006 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40"} err="failed to get container status \"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40\": rpc error: code = NotFound desc = could not find container \"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40\": container with ID starting with 632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40 not found: ID does not exist" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.091187 4789 scope.go:117] "RemoveContainer" 
containerID="7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215" Dec 16 07:12:25 crc kubenswrapper[4789]: E1216 07:12:25.091767 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215\": container with ID starting with 7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215 not found: ID does not exist" containerID="7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.091802 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215"} err="failed to get container status \"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215\": rpc error: code = NotFound desc = could not find container \"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215\": container with ID starting with 7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215 not found: ID does not exist" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.091816 4789 scope.go:117] "RemoveContainer" containerID="632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.092470 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40"} err="failed to get container status \"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40\": rpc error: code = NotFound desc = could not find container \"632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40\": container with ID starting with 632e2f152ca4a54340aaf9a2939717e0f69492b568389636e63472117acbca40 not found: ID does not exist" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.092526 4789 scope.go:117] 
"RemoveContainer" containerID="7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.093023 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215"} err="failed to get container status \"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215\": rpc error: code = NotFound desc = could not find container \"7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215\": container with ID starting with 7be428cdb4810fd3ea40e06e017b790e6943cd58fe38fc7c28b18667d067d215 not found: ID does not exist" Dec 16 07:12:25 crc kubenswrapper[4789]: I1216 07:12:25.356027 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:12:25 crc kubenswrapper[4789]: W1216 07:12:25.358123 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0633f9f_8819_4d01_8925_3c09e214c5f3.slice/crio-1a1fe6c8c3f99ebed9d8e0ccae9aedd4686e5dc9ee14d536a2efb37443998eb7 WatchSource:0}: Error finding container 1a1fe6c8c3f99ebed9d8e0ccae9aedd4686e5dc9ee14d536a2efb37443998eb7: Status 404 returned error can't find the container with id 1a1fe6c8c3f99ebed9d8e0ccae9aedd4686e5dc9ee14d536a2efb37443998eb7 Dec 16 07:12:26 crc kubenswrapper[4789]: I1216 07:12:26.041024 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb49c501-7035-44f5-8db1-b88552df2500","Type":"ContainerStarted","Data":"0df976f34b34ac9975bb7a34f8e6afbcddb2a22bcfd17a9c19f5b63f0b00cbda"} Dec 16 07:12:26 crc kubenswrapper[4789]: I1216 07:12:26.058594 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm9j2" 
event={"ID":"deca8e15-b233-4a6c-bc1e-06494fca64bb","Type":"ContainerStarted","Data":"0b0240b70cddddde386248f7aa63d1008a41744df7cabf3cdb8893cad6a13888"} Dec 16 07:12:26 crc kubenswrapper[4789]: I1216 07:12:26.074411 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerStarted","Data":"c4db002ce0efdc2cb144b4904f1736f5dc17ac20f562094058a1e415e546010c"} Dec 16 07:12:26 crc kubenswrapper[4789]: I1216 07:12:26.090394 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0633f9f-8819-4d01-8925-3c09e214c5f3","Type":"ContainerStarted","Data":"61ee4f5f23aa653cfe8438c1bb953359d5aad0e0cc33f763adc23e8bb55b4325"} Dec 16 07:12:26 crc kubenswrapper[4789]: I1216 07:12:26.090439 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0633f9f-8819-4d01-8925-3c09e214c5f3","Type":"ContainerStarted","Data":"1a1fe6c8c3f99ebed9d8e0ccae9aedd4686e5dc9ee14d536a2efb37443998eb7"} Dec 16 07:12:26 crc kubenswrapper[4789]: I1216 07:12:26.095530 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.09551132 podStartE2EDuration="5.09551132s" podCreationTimestamp="2025-12-16 07:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:26.076853405 +0000 UTC m=+1284.338741034" watchObservedRunningTime="2025-12-16 07:12:26.09551132 +0000 UTC m=+1284.357398949" Dec 16 07:12:26 crc kubenswrapper[4789]: I1216 07:12:26.102265 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vm9j2" podStartSLOduration=4.102245063 podStartE2EDuration="4.102245063s" podCreationTimestamp="2025-12-16 07:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:26.101376713 +0000 UTC m=+1284.363264342" watchObservedRunningTime="2025-12-16 07:12:26.102245063 +0000 UTC m=+1284.364132692" Dec 16 07:12:27 crc kubenswrapper[4789]: I1216 07:12:27.106694 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0633f9f-8819-4d01-8925-3c09e214c5f3","Type":"ContainerStarted","Data":"6b837c35d386080c7160b250bba62c98d1d6538584b87e84bdf27aa41539aeb3"} Dec 16 07:12:27 crc kubenswrapper[4789]: I1216 07:12:27.113010 4789 generic.go:334] "Generic (PLEG): container finished" podID="4284636e-4d98-4efc-a75a-18eada4a3a8d" containerID="f41be4360c65f86b476a62a1224e82bdbdfd4163db8a0afd3bb6ffc86b640b24" exitCode=0 Dec 16 07:12:27 crc kubenswrapper[4789]: I1216 07:12:27.113063 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jvbz" event={"ID":"4284636e-4d98-4efc-a75a-18eada4a3a8d","Type":"ContainerDied","Data":"f41be4360c65f86b476a62a1224e82bdbdfd4163db8a0afd3bb6ffc86b640b24"} Dec 16 07:12:27 crc kubenswrapper[4789]: I1216 07:12:27.141223 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.141207147 podStartE2EDuration="4.141207147s" podCreationTimestamp="2025-12-16 07:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:27.139348722 +0000 UTC m=+1285.401236351" watchObservedRunningTime="2025-12-16 07:12:27.141207147 +0000 UTC m=+1285.403094776" Dec 16 07:12:31 crc kubenswrapper[4789]: I1216 07:12:31.947043 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 07:12:31 crc kubenswrapper[4789]: I1216 07:12:31.947666 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Dec 16 07:12:31 crc kubenswrapper[4789]: I1216 07:12:31.992445 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:12:32 crc kubenswrapper[4789]: I1216 07:12:32.004450 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:12:32 crc kubenswrapper[4789]: I1216 07:12:32.171070 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:12:32 crc kubenswrapper[4789]: I1216 07:12:32.171418 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:12:33 crc kubenswrapper[4789]: I1216 07:12:33.413688 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:33 crc kubenswrapper[4789]: I1216 07:12:33.413757 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:33 crc kubenswrapper[4789]: I1216 07:12:33.446679 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:33 crc kubenswrapper[4789]: I1216 07:12:33.458396 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:34 crc kubenswrapper[4789]: I1216 07:12:34.120459 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:12:34 crc kubenswrapper[4789]: I1216 07:12:34.185970 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:12:34 crc kubenswrapper[4789]: I1216 07:12:34.186352 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Dec 16 07:12:34 crc kubenswrapper[4789]: I1216 07:12:34.186397 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:34 crc kubenswrapper[4789]: I1216 07:12:34.189568 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:12:35 crc kubenswrapper[4789]: I1216 07:12:35.204870 4789 generic.go:334] "Generic (PLEG): container finished" podID="deca8e15-b233-4a6c-bc1e-06494fca64bb" containerID="0b0240b70cddddde386248f7aa63d1008a41744df7cabf3cdb8893cad6a13888" exitCode=0 Dec 16 07:12:35 crc kubenswrapper[4789]: I1216 07:12:35.205669 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm9j2" event={"ID":"deca8e15-b233-4a6c-bc1e-06494fca64bb","Type":"ContainerDied","Data":"0b0240b70cddddde386248f7aa63d1008a41744df7cabf3cdb8893cad6a13888"} Dec 16 07:12:36 crc kubenswrapper[4789]: I1216 07:12:36.213105 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:36 crc kubenswrapper[4789]: I1216 07:12:36.213450 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:12:36 crc kubenswrapper[4789]: I1216 07:12:36.221336 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.100317 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.108491 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172534 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww8fq\" (UniqueName: \"kubernetes.io/projected/deca8e15-b233-4a6c-bc1e-06494fca64bb-kube-api-access-ww8fq\") pod \"deca8e15-b233-4a6c-bc1e-06494fca64bb\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172575 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-db-sync-config-data\") pod \"4284636e-4d98-4efc-a75a-18eada4a3a8d\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172607 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kvjt\" (UniqueName: \"kubernetes.io/projected/4284636e-4d98-4efc-a75a-18eada4a3a8d-kube-api-access-2kvjt\") pod \"4284636e-4d98-4efc-a75a-18eada4a3a8d\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172651 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-config-data\") pod \"deca8e15-b233-4a6c-bc1e-06494fca64bb\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172757 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-fernet-keys\") pod \"deca8e15-b233-4a6c-bc1e-06494fca64bb\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172786 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-scripts\") pod \"deca8e15-b233-4a6c-bc1e-06494fca64bb\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172822 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-combined-ca-bundle\") pod \"4284636e-4d98-4efc-a75a-18eada4a3a8d\" (UID: \"4284636e-4d98-4efc-a75a-18eada4a3a8d\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172851 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-combined-ca-bundle\") pod \"deca8e15-b233-4a6c-bc1e-06494fca64bb\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.172903 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-credential-keys\") pod \"deca8e15-b233-4a6c-bc1e-06494fca64bb\" (UID: \"deca8e15-b233-4a6c-bc1e-06494fca64bb\") " Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.182494 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-scripts" (OuterVolumeSpecName: "scripts") pod "deca8e15-b233-4a6c-bc1e-06494fca64bb" (UID: "deca8e15-b233-4a6c-bc1e-06494fca64bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.183340 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4284636e-4d98-4efc-a75a-18eada4a3a8d-kube-api-access-2kvjt" (OuterVolumeSpecName: "kube-api-access-2kvjt") pod "4284636e-4d98-4efc-a75a-18eada4a3a8d" (UID: "4284636e-4d98-4efc-a75a-18eada4a3a8d"). InnerVolumeSpecName "kube-api-access-2kvjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.183434 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4284636e-4d98-4efc-a75a-18eada4a3a8d" (UID: "4284636e-4d98-4efc-a75a-18eada4a3a8d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.184006 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "deca8e15-b233-4a6c-bc1e-06494fca64bb" (UID: "deca8e15-b233-4a6c-bc1e-06494fca64bb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.185747 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "deca8e15-b233-4a6c-bc1e-06494fca64bb" (UID: "deca8e15-b233-4a6c-bc1e-06494fca64bb"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.188198 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deca8e15-b233-4a6c-bc1e-06494fca64bb-kube-api-access-ww8fq" (OuterVolumeSpecName: "kube-api-access-ww8fq") pod "deca8e15-b233-4a6c-bc1e-06494fca64bb" (UID: "deca8e15-b233-4a6c-bc1e-06494fca64bb"). InnerVolumeSpecName "kube-api-access-ww8fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.216382 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-config-data" (OuterVolumeSpecName: "config-data") pod "deca8e15-b233-4a6c-bc1e-06494fca64bb" (UID: "deca8e15-b233-4a6c-bc1e-06494fca64bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.224604 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jvbz" event={"ID":"4284636e-4d98-4efc-a75a-18eada4a3a8d","Type":"ContainerDied","Data":"0910006e2935b138781d36be01f7586d5cf3c5805ab115454c42bf3274b40c75"} Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.224815 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0910006e2935b138781d36be01f7586d5cf3c5805ab115454c42bf3274b40c75" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.224624 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9jvbz" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.229168 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm9j2" event={"ID":"deca8e15-b233-4a6c-bc1e-06494fca64bb","Type":"ContainerDied","Data":"44dff939d888cac0ff4996300d6f866e5aa913e51968dc051c5bff40fdf15066"} Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.229216 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44dff939d888cac0ff4996300d6f866e5aa913e51968dc051c5bff40fdf15066" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.231071 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vm9j2" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.233996 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4284636e-4d98-4efc-a75a-18eada4a3a8d" (UID: "4284636e-4d98-4efc-a75a-18eada4a3a8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.237619 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deca8e15-b233-4a6c-bc1e-06494fca64bb" (UID: "deca8e15-b233-4a6c-bc1e-06494fca64bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282100 4789 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282142 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282154 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282165 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282176 4789 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282191 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww8fq\" (UniqueName: \"kubernetes.io/projected/deca8e15-b233-4a6c-bc1e-06494fca64bb-kube-api-access-ww8fq\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282203 4789 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4284636e-4d98-4efc-a75a-18eada4a3a8d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282214 
4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kvjt\" (UniqueName: \"kubernetes.io/projected/4284636e-4d98-4efc-a75a-18eada4a3a8d-kube-api-access-2kvjt\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.282224 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deca8e15-b233-4a6c-bc1e-06494fca64bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.363559 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6789db9888-57dmq"] Dec 16 07:12:37 crc kubenswrapper[4789]: E1216 07:12:37.363942 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4284636e-4d98-4efc-a75a-18eada4a3a8d" containerName="barbican-db-sync" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.363959 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4284636e-4d98-4efc-a75a-18eada4a3a8d" containerName="barbican-db-sync" Dec 16 07:12:37 crc kubenswrapper[4789]: E1216 07:12:37.363972 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deca8e15-b233-4a6c-bc1e-06494fca64bb" containerName="keystone-bootstrap" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.363978 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca8e15-b233-4a6c-bc1e-06494fca64bb" containerName="keystone-bootstrap" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.364145 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="deca8e15-b233-4a6c-bc1e-06494fca64bb" containerName="keystone-bootstrap" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.364166 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4284636e-4d98-4efc-a75a-18eada4a3a8d" containerName="barbican-db-sync" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.364669 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.369003 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.369119 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.378137 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6789db9888-57dmq"] Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388351 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6pff\" (UniqueName: \"kubernetes.io/projected/254f667d-eae3-486b-b9e8-ffc571d65635-kube-api-access-k6pff\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388397 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-public-tls-certs\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388440 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-credential-keys\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388482 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-config-data\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388531 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-combined-ca-bundle\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388553 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-internal-tls-certs\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388589 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-fernet-keys\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.388607 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-scripts\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.507139 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-combined-ca-bundle\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.507395 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-internal-tls-certs\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.507524 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-fernet-keys\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.507623 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-scripts\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.507829 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6pff\" (UniqueName: \"kubernetes.io/projected/254f667d-eae3-486b-b9e8-ffc571d65635-kube-api-access-k6pff\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.507959 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-public-tls-certs\") pod 
\"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.508075 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-credential-keys\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.508201 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-config-data\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.519819 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-internal-tls-certs\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.525793 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-credential-keys\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.525825 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-combined-ca-bundle\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " 
pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.527402 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-public-tls-certs\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.527629 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-config-data\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.527643 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-fernet-keys\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.528340 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-scripts\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.532339 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6pff\" (UniqueName: \"kubernetes.io/projected/254f667d-eae3-486b-b9e8-ffc571d65635-kube-api-access-k6pff\") pod \"keystone-6789db9888-57dmq\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") " pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:37 crc kubenswrapper[4789]: I1216 07:12:37.683618 4789 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.229378 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6789db9888-57dmq"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.275014 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bk864" event={"ID":"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6","Type":"ContainerStarted","Data":"6817790e0d358dcc4395f0de504a86d8b3b7db41eb13bb80735e722842fce735"} Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.286596 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerStarted","Data":"187778fe04c696d2f6a76319e9ef6f4337b035d13109fc7b2c401c2a09d3e938"} Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.330201 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-bk864" podStartSLOduration=2.107009136 podStartE2EDuration="47.330175602s" podCreationTimestamp="2025-12-16 07:11:51 +0000 UTC" firstStartedPulling="2025-12-16 07:11:52.433874166 +0000 UTC m=+1250.695761795" lastFinishedPulling="2025-12-16 07:12:37.657040632 +0000 UTC m=+1295.918928261" observedRunningTime="2025-12-16 07:12:38.313341903 +0000 UTC m=+1296.575229532" watchObservedRunningTime="2025-12-16 07:12:38.330175602 +0000 UTC m=+1296.592063231" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.469257 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.470751 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.473365 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.473387 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-km9xh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.479811 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.485104 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-864d99d789-mv5rh"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.486788 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.492875 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.517006 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-864d99d789-mv5rh"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.535209 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.576020 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8fffc8985-mdm7j"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.577567 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.612878 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8fffc8985-mdm7j"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.630959 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-combined-ca-bundle\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631012 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631045 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-872l5\" (UniqueName: \"kubernetes.io/projected/f00adc24-beed-43df-95a8-274b841d60a0-kube-api-access-872l5\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631075 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data-custom\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: 
I1216 07:12:38.631142 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xr6l\" (UniqueName: \"kubernetes.io/projected/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-kube-api-access-8xr6l\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631205 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631226 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data-custom\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631260 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f00adc24-beed-43df-95a8-274b841d60a0-logs\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631286 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-logs\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: 
\"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.631321 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-combined-ca-bundle\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.661235 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8f6654c7b-cm5sf"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.663137 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.669244 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.673181 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8f6654c7b-cm5sf"] Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.733855 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-nb\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.733937 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-872l5\" (UniqueName: \"kubernetes.io/projected/f00adc24-beed-43df-95a8-274b841d60a0-kube-api-access-872l5\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " 
pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.733976 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data-custom\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734022 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-config\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734057 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-swift-storage-0\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xr6l\" (UniqueName: \"kubernetes.io/projected/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-kube-api-access-8xr6l\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734225 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt7sp\" (UniqueName: 
\"kubernetes.io/projected/0d472548-77d0-45dc-846c-b36f55c20d05-kube-api-access-vt7sp\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734259 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734274 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data-custom\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734301 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f00adc24-beed-43df-95a8-274b841d60a0-logs\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734322 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-logs\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734337 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-combined-ca-bundle\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734353 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-sb\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734386 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-svc\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734417 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-combined-ca-bundle\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.734438 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.735785 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f00adc24-beed-43df-95a8-274b841d60a0-logs\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.737813 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-logs\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.738581 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-combined-ca-bundle\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.738792 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data-custom\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.740579 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.741443 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.741467 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-combined-ca-bundle\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.749588 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data-custom\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.753276 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-872l5\" (UniqueName: \"kubernetes.io/projected/f00adc24-beed-43df-95a8-274b841d60a0-kube-api-access-872l5\") pod \"barbican-worker-864d99d789-mv5rh\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.760639 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xr6l\" (UniqueName: \"kubernetes.io/projected/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-kube-api-access-8xr6l\") pod \"barbican-keystone-listener-6cfddfd9f4-hmzjm\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.808665 4789 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.824218 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.838075 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.838303 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-nb\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.838457 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-config\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.838569 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-swift-storage-0\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.838682 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-f9q7z\" (UniqueName: \"kubernetes.io/projected/765d73af-7fc6-49af-8c1e-e7558e7f5350-kube-api-access-f9q7z\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.838815 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt7sp\" (UniqueName: \"kubernetes.io/projected/0d472548-77d0-45dc-846c-b36f55c20d05-kube-api-access-vt7sp\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.838993 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data-custom\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.839146 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-sb\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.839268 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-combined-ca-bundle\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.839366 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-svc\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.839464 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/765d73af-7fc6-49af-8c1e-e7558e7f5350-logs\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.840507 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-nb\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.841976 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-sb\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.843063 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-config\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.843747 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-swift-storage-0\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.844340 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-svc\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.857763 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt7sp\" (UniqueName: \"kubernetes.io/projected/0d472548-77d0-45dc-846c-b36f55c20d05-kube-api-access-vt7sp\") pod \"dnsmasq-dns-8fffc8985-mdm7j\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.923865 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.942801 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data-custom\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.942974 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-combined-ca-bundle\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.943022 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/765d73af-7fc6-49af-8c1e-e7558e7f5350-logs\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.943057 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.943191 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9q7z\" (UniqueName: \"kubernetes.io/projected/765d73af-7fc6-49af-8c1e-e7558e7f5350-kube-api-access-f9q7z\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " 
pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.948219 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/765d73af-7fc6-49af-8c1e-e7558e7f5350-logs\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.950583 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data-custom\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.953977 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.955415 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-combined-ca-bundle\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 07:12:38.969596 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9q7z\" (UniqueName: \"kubernetes.io/projected/765d73af-7fc6-49af-8c1e-e7558e7f5350-kube-api-access-f9q7z\") pod \"barbican-api-8f6654c7b-cm5sf\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:38 crc kubenswrapper[4789]: I1216 
07:12:38.993416 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.388182 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm"] Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.413446 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6789db9888-57dmq" event={"ID":"254f667d-eae3-486b-b9e8-ffc571d65635","Type":"ContainerStarted","Data":"d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269"} Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.413496 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6789db9888-57dmq" event={"ID":"254f667d-eae3-486b-b9e8-ffc571d65635","Type":"ContainerStarted","Data":"2aa198f94427fbc639cd9718e61ee6d951f034ffbb159b6995767295359c5369"} Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.414599 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.445072 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-glqgh" event={"ID":"5d632824-4eaa-4698-b244-88872be244b8","Type":"ContainerStarted","Data":"d35da1999fd3dc35eaf2c9bde171ffdbd72680cf0a247a9b921b35018bb0d859"} Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.447691 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-864d99d789-mv5rh"] Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.536722 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6789db9888-57dmq" podStartSLOduration=2.5366990190000003 podStartE2EDuration="2.536699019s" podCreationTimestamp="2025-12-16 07:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-16 07:12:39.474661191 +0000 UTC m=+1297.736548820" watchObservedRunningTime="2025-12-16 07:12:39.536699019 +0000 UTC m=+1297.798586648" Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.537145 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-glqgh" podStartSLOduration=3.417565057 podStartE2EDuration="48.537141659s" podCreationTimestamp="2025-12-16 07:11:51 +0000 UTC" firstStartedPulling="2025-12-16 07:11:52.53747037 +0000 UTC m=+1250.799357999" lastFinishedPulling="2025-12-16 07:12:37.657046962 +0000 UTC m=+1295.918934601" observedRunningTime="2025-12-16 07:12:39.511895876 +0000 UTC m=+1297.773783505" watchObservedRunningTime="2025-12-16 07:12:39.537141659 +0000 UTC m=+1297.799029288" Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.591584 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8fffc8985-mdm7j"] Dec 16 07:12:39 crc kubenswrapper[4789]: I1216 07:12:39.883260 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8f6654c7b-cm5sf"] Dec 16 07:12:39 crc kubenswrapper[4789]: W1216 07:12:39.887483 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod765d73af_7fc6_49af_8c1e_e7558e7f5350.slice/crio-70112c2d4331dfccf9367ce3625a7cd7198abb4a46517f0a9d47e7e66f3b1748 WatchSource:0}: Error finding container 70112c2d4331dfccf9367ce3625a7cd7198abb4a46517f0a9d47e7e66f3b1748: Status 404 returned error can't find the container with id 70112c2d4331dfccf9367ce3625a7cd7198abb4a46517f0a9d47e7e66f3b1748 Dec 16 07:12:40 crc kubenswrapper[4789]: I1216 07:12:40.456587 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-864d99d789-mv5rh" event={"ID":"f00adc24-beed-43df-95a8-274b841d60a0","Type":"ContainerStarted","Data":"46a86ae055389e3e45af7fd0e66dfa93de8caa3ffd8d4abed4b008fc72618855"} Dec 16 07:12:40 crc 
kubenswrapper[4789]: I1216 07:12:40.458505 4789 generic.go:334] "Generic (PLEG): container finished" podID="0d472548-77d0-45dc-846c-b36f55c20d05" containerID="a982c1baac7a3603c6318a205667d362036ffc12d8dcc384c21001c00f1bcb52" exitCode=0 Dec 16 07:12:40 crc kubenswrapper[4789]: I1216 07:12:40.458568 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" event={"ID":"0d472548-77d0-45dc-846c-b36f55c20d05","Type":"ContainerDied","Data":"a982c1baac7a3603c6318a205667d362036ffc12d8dcc384c21001c00f1bcb52"} Dec 16 07:12:40 crc kubenswrapper[4789]: I1216 07:12:40.458615 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" event={"ID":"0d472548-77d0-45dc-846c-b36f55c20d05","Type":"ContainerStarted","Data":"0e24f62d7f83361d7342c0cdd24256d092a736eb696dcb4ddcfc298b3f05e9ae"} Dec 16 07:12:40 crc kubenswrapper[4789]: I1216 07:12:40.459632 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8f6654c7b-cm5sf" event={"ID":"765d73af-7fc6-49af-8c1e-e7558e7f5350","Type":"ContainerStarted","Data":"70112c2d4331dfccf9367ce3625a7cd7198abb4a46517f0a9d47e7e66f3b1748"} Dec 16 07:12:40 crc kubenswrapper[4789]: I1216 07:12:40.461365 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" event={"ID":"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0","Type":"ContainerStarted","Data":"bc3870b72ed3ba51a30fc640835cb553744960bf695d8ff62857e88a889b781d"} Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.654940 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-bbbc99994-cwbpw"] Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.656881 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.663578 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bbbc99994-cwbpw"] Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.665213 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.665498 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.822799 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-combined-ca-bundle\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.823242 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.823407 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-internal-tls-certs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.823528 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data-custom\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.823623 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-public-tls-certs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.823739 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b0a572-437e-4d15-a74d-e92c0f39c9cc-logs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.823865 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29plf\" (UniqueName: \"kubernetes.io/projected/93b0a572-437e-4d15-a74d-e92c0f39c9cc-kube-api-access-29plf\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925098 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-internal-tls-certs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925169 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data-custom\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925197 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-public-tls-certs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925241 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b0a572-437e-4d15-a74d-e92c0f39c9cc-logs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925273 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29plf\" (UniqueName: \"kubernetes.io/projected/93b0a572-437e-4d15-a74d-e92c0f39c9cc-kube-api-access-29plf\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925341 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-combined-ca-bundle\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925397 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.925672 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b0a572-437e-4d15-a74d-e92c0f39c9cc-logs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.931359 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-internal-tls-certs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.932420 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-public-tls-certs\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.938087 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.938834 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-combined-ca-bundle\") pod \"barbican-api-bbbc99994-cwbpw\" 
(UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.940501 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data-custom\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.943837 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29plf\" (UniqueName: \"kubernetes.io/projected/93b0a572-437e-4d15-a74d-e92c0f39c9cc-kube-api-access-29plf\") pod \"barbican-api-bbbc99994-cwbpw\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:41 crc kubenswrapper[4789]: I1216 07:12:41.975609 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:42 crc kubenswrapper[4789]: I1216 07:12:42.480779 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8f6654c7b-cm5sf" event={"ID":"765d73af-7fc6-49af-8c1e-e7558e7f5350","Type":"ContainerStarted","Data":"2e7281997b6da968c5c0f17931656533f9519f8f8bf9919d3f5f3ac881bce263"} Dec 16 07:12:42 crc kubenswrapper[4789]: I1216 07:12:42.556455 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bbbc99994-cwbpw"] Dec 16 07:12:43 crc kubenswrapper[4789]: I1216 07:12:43.488807 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" event={"ID":"0d472548-77d0-45dc-846c-b36f55c20d05","Type":"ContainerStarted","Data":"c41d925e9962e9941cadb03150aa4d3844326a4c3c3b013fef2f48da381dfd6e"} Dec 16 07:12:43 crc kubenswrapper[4789]: I1216 07:12:43.490509 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-8f6654c7b-cm5sf" event={"ID":"765d73af-7fc6-49af-8c1e-e7558e7f5350","Type":"ContainerStarted","Data":"9bbe35556e1b3bea7379d840e0600ff1ce16e3961c559c5ea3950f67a8e60cb2"} Dec 16 07:12:43 crc kubenswrapper[4789]: I1216 07:12:43.492295 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbbc99994-cwbpw" event={"ID":"93b0a572-437e-4d15-a74d-e92c0f39c9cc","Type":"ContainerStarted","Data":"f71a167007b51e6b5519402191320729c426676f344ae45a6739ee1603881192"} Dec 16 07:12:43 crc kubenswrapper[4789]: I1216 07:12:43.492341 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbbc99994-cwbpw" event={"ID":"93b0a572-437e-4d15-a74d-e92c0f39c9cc","Type":"ContainerStarted","Data":"985ea34c0a9778039db9bb9bc7969fabdc2e50c353ec77704428e8eca67e386c"} Dec 16 07:12:44 crc kubenswrapper[4789]: I1216 07:12:44.502429 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbbc99994-cwbpw" event={"ID":"93b0a572-437e-4d15-a74d-e92c0f39c9cc","Type":"ContainerStarted","Data":"063de08125ee34cdd307d91af26a93c26622ecd506fc1ef247a55845563c91d4"} Dec 16 07:12:44 crc kubenswrapper[4789]: I1216 07:12:44.503088 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:44 crc kubenswrapper[4789]: I1216 07:12:44.503108 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:44 crc kubenswrapper[4789]: I1216 07:12:44.522121 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-bbbc99994-cwbpw" podStartSLOduration=3.5221040439999998 podStartE2EDuration="3.522104044s" podCreationTimestamp="2025-12-16 07:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:44.519424979 +0000 UTC m=+1302.781312608" 
watchObservedRunningTime="2025-12-16 07:12:44.522104044 +0000 UTC m=+1302.783991673" Dec 16 07:12:44 crc kubenswrapper[4789]: I1216 07:12:44.548645 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8f6654c7b-cm5sf" podStartSLOduration=6.548621568 podStartE2EDuration="6.548621568s" podCreationTimestamp="2025-12-16 07:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:44.542651134 +0000 UTC m=+1302.804538763" watchObservedRunningTime="2025-12-16 07:12:44.548621568 +0000 UTC m=+1302.810509197" Dec 16 07:12:44 crc kubenswrapper[4789]: I1216 07:12:44.564548 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" podStartSLOduration=6.564524194 podStartE2EDuration="6.564524194s" podCreationTimestamp="2025-12-16 07:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:44.559141214 +0000 UTC m=+1302.821028853" watchObservedRunningTime="2025-12-16 07:12:44.564524194 +0000 UTC m=+1302.826411833" Dec 16 07:12:45 crc kubenswrapper[4789]: I1216 07:12:45.511144 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:47 crc kubenswrapper[4789]: I1216 07:12:47.530822 4789 generic.go:334] "Generic (PLEG): container finished" podID="5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" containerID="6817790e0d358dcc4395f0de504a86d8b3b7db41eb13bb80735e722842fce735" exitCode=0 Dec 16 07:12:47 crc kubenswrapper[4789]: I1216 07:12:47.530906 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bk864" event={"ID":"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6","Type":"ContainerDied","Data":"6817790e0d358dcc4395f0de504a86d8b3b7db41eb13bb80735e722842fce735"} Dec 16 07:12:48 crc 
kubenswrapper[4789]: I1216 07:12:48.882707 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bk864" Dec 16 07:12:48 crc kubenswrapper[4789]: I1216 07:12:48.926144 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:12:48 crc kubenswrapper[4789]: E1216 07:12:48.977874 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" Dec 16 07:12:48 crc kubenswrapper[4789]: I1216 07:12:48.997041 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:48 crc kubenswrapper[4789]: I1216 07:12:48.997369 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.019472 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-klxrg"] Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.019695 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" podUID="67a83e3d-660c-40f0-893c-e8476053df0c" containerName="dnsmasq-dns" containerID="cri-o://a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778" gracePeriod=10 Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.073162 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-scripts\") pod \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.073452 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8qc7\" (UniqueName: \"kubernetes.io/projected/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-kube-api-access-b8qc7\") pod \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.073471 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-config-data\") pod \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.073504 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-combined-ca-bundle\") pod \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.073622 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-logs\") pod \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\" (UID: \"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.081377 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-logs" (OuterVolumeSpecName: "logs") pod "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" (UID: "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.082075 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-scripts" (OuterVolumeSpecName: "scripts") pod "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" (UID: "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.094309 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-kube-api-access-b8qc7" (OuterVolumeSpecName: "kube-api-access-b8qc7") pod "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" (UID: "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6"). InnerVolumeSpecName "kube-api-access-b8qc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.130051 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-config-data" (OuterVolumeSpecName: "config-data") pod "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" (UID: "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.141546 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" (UID: "5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.175805 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8qc7\" (UniqueName: \"kubernetes.io/projected/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-kube-api-access-b8qc7\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.175836 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.175845 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.175854 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.175865 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.497083 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.551817 4789 generic.go:334] "Generic (PLEG): container finished" podID="5d632824-4eaa-4698-b244-88872be244b8" containerID="d35da1999fd3dc35eaf2c9bde171ffdbd72680cf0a247a9b921b35018bb0d859" exitCode=0 Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.551881 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-glqgh" event={"ID":"5d632824-4eaa-4698-b244-88872be244b8","Type":"ContainerDied","Data":"d35da1999fd3dc35eaf2c9bde171ffdbd72680cf0a247a9b921b35018bb0d859"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.554454 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-864d99d789-mv5rh" event={"ID":"f00adc24-beed-43df-95a8-274b841d60a0","Type":"ContainerStarted","Data":"0f48a94f282f9b6fdc9bae2f55acd33cf0b6397237ba430e41c42e3e2660b0b4"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.554490 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-864d99d789-mv5rh" event={"ID":"f00adc24-beed-43df-95a8-274b841d60a0","Type":"ContainerStarted","Data":"305b96e7f12f126be8501fb24906a6e570466b444e3bf1c2ee9a66e30d5add7f"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.573868 4789 generic.go:334] "Generic (PLEG): container finished" podID="67a83e3d-660c-40f0-893c-e8476053df0c" containerID="a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778" exitCode=0 Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.574127 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" event={"ID":"67a83e3d-660c-40f0-893c-e8476053df0c","Type":"ContainerDied","Data":"a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.574205 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" 
event={"ID":"67a83e3d-660c-40f0-893c-e8476053df0c","Type":"ContainerDied","Data":"0490f3075ca5794c7e1b9e2d1610d36f9197ee9160e22b8f24a8033881d6ccd2"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.574276 4789 scope.go:117] "RemoveContainer" containerID="a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.574449 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6f8cb849-klxrg" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.577274 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bk864" event={"ID":"5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6","Type":"ContainerDied","Data":"ce942095dd7a42a7e3fc0e4be23674e1224757fed8d512b2399e11ce4c0cf379"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.577358 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce942095dd7a42a7e3fc0e4be23674e1224757fed8d512b2399e11ce4c0cf379" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.577451 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-bk864" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.599694 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerStarted","Data":"3b016492e16aa0b6eb0cf520a0e6405b67a5e116d7ebd06f8c3f3f1f3defa394"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.600343 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="ceilometer-notification-agent" containerID="cri-o://c4db002ce0efdc2cb144b4904f1736f5dc17ac20f562094058a1e415e546010c" gracePeriod=30 Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.600643 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.600944 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="proxy-httpd" containerID="cri-o://3b016492e16aa0b6eb0cf520a0e6405b67a5e116d7ebd06f8c3f3f1f3defa394" gracePeriod=30 Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.601049 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="sg-core" containerID="cri-o://187778fe04c696d2f6a76319e9ef6f4337b035d13109fc7b2c401c2a09d3e938" gracePeriod=30 Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.614645 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.614762 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.615959 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" event={"ID":"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0","Type":"ContainerStarted","Data":"2339da9c2a0e96f20cb674ffa23240a66f38f9520dccf9e45eba3c48effd6316"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.616063 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" event={"ID":"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0","Type":"ContainerStarted","Data":"38cb122304e1fdaa474b979a230d2c0c3d7a6825b41760a62e2b6923a3837070"} Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.620541 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-864d99d789-mv5rh" podStartSLOduration=2.620045423 podStartE2EDuration="11.620522006s" podCreationTimestamp="2025-12-16 07:12:38 +0000 UTC" firstStartedPulling="2025-12-16 07:12:39.51125612 +0000 UTC m=+1297.773143749" lastFinishedPulling="2025-12-16 07:12:48.511732703 +0000 UTC m=+1306.773620332" observedRunningTime="2025-12-16 07:12:49.588827786 +0000 UTC m=+1307.850715415" watchObservedRunningTime="2025-12-16 07:12:49.620522006 +0000 UTC m=+1307.882409635" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.621320 4789 scope.go:117] "RemoveContainer" containerID="a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.670010 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" podStartSLOduration=2.556921309 podStartE2EDuration="11.669987127s" podCreationTimestamp="2025-12-16 07:12:38 +0000 UTC" firstStartedPulling="2025-12-16 07:12:39.401646878 +0000 UTC m=+1297.663534507" lastFinishedPulling="2025-12-16 07:12:48.514712696 +0000 UTC m=+1306.776600325" observedRunningTime="2025-12-16 07:12:49.666177194 +0000 UTC m=+1307.928064833" watchObservedRunningTime="2025-12-16 07:12:49.669987127 +0000 UTC m=+1307.931874746" Dec 16 
07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.690316 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-nb\") pod \"67a83e3d-660c-40f0-893c-e8476053df0c\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.690563 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-config\") pod \"67a83e3d-660c-40f0-893c-e8476053df0c\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.690671 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-svc\") pod \"67a83e3d-660c-40f0-893c-e8476053df0c\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.690804 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwbdr\" (UniqueName: \"kubernetes.io/projected/67a83e3d-660c-40f0-893c-e8476053df0c-kube-api-access-wwbdr\") pod \"67a83e3d-660c-40f0-893c-e8476053df0c\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.690900 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-swift-storage-0\") pod \"67a83e3d-660c-40f0-893c-e8476053df0c\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.691035 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-sb\") pod \"67a83e3d-660c-40f0-893c-e8476053df0c\" (UID: \"67a83e3d-660c-40f0-893c-e8476053df0c\") " Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.693093 4789 scope.go:117] "RemoveContainer" containerID="a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778" Dec 16 07:12:49 crc kubenswrapper[4789]: E1216 07:12:49.699602 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778\": container with ID starting with a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778 not found: ID does not exist" containerID="a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.699659 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778"} err="failed to get container status \"a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778\": rpc error: code = NotFound desc = could not find container \"a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778\": container with ID starting with a6d1e6d867490fbde3c83ac374999c556b6e4229f10b10a0790defc35c056778 not found: ID does not exist" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.699691 4789 scope.go:117] "RemoveContainer" containerID="a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8" Dec 16 07:12:49 crc kubenswrapper[4789]: E1216 07:12:49.703059 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8\": container with ID starting with a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8 not found: ID does not exist" 
containerID="a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.703104 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8"} err="failed to get container status \"a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8\": rpc error: code = NotFound desc = could not find container \"a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8\": container with ID starting with a9a1206a5928973f8adceca994f33eba307cafd92db010f6996c8201b1b838a8 not found: ID does not exist" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.716895 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a83e3d-660c-40f0-893c-e8476053df0c-kube-api-access-wwbdr" (OuterVolumeSpecName: "kube-api-access-wwbdr") pod "67a83e3d-660c-40f0-893c-e8476053df0c" (UID: "67a83e3d-660c-40f0-893c-e8476053df0c"). InnerVolumeSpecName "kube-api-access-wwbdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.767491 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67a83e3d-660c-40f0-893c-e8476053df0c" (UID: "67a83e3d-660c-40f0-893c-e8476053df0c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.793234 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwbdr\" (UniqueName: \"kubernetes.io/projected/67a83e3d-660c-40f0-893c-e8476053df0c-kube-api-access-wwbdr\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.793269 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.830438 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b48fd45b4-hp2xw"] Dec 16 07:12:49 crc kubenswrapper[4789]: E1216 07:12:49.830887 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" containerName="placement-db-sync" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.830902 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" containerName="placement-db-sync" Dec 16 07:12:49 crc kubenswrapper[4789]: E1216 07:12:49.830947 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67a83e3d-660c-40f0-893c-e8476053df0c" containerName="dnsmasq-dns" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.830955 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a83e3d-660c-40f0-893c-e8476053df0c" containerName="dnsmasq-dns" Dec 16 07:12:49 crc kubenswrapper[4789]: E1216 07:12:49.830984 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67a83e3d-660c-40f0-893c-e8476053df0c" containerName="init" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.830990 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a83e3d-660c-40f0-893c-e8476053df0c" containerName="init" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.831201 4789 
memory_manager.go:354] "RemoveStaleState removing state" podUID="67a83e3d-660c-40f0-893c-e8476053df0c" containerName="dnsmasq-dns" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.831229 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" containerName="placement-db-sync" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.849043 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.854295 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.854518 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.854683 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.874349 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nr77q" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.874750 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.917101 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b48fd45b4-hp2xw"] Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.951358 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-config" (OuterVolumeSpecName: "config") pod "67a83e3d-660c-40f0-893c-e8476053df0c" (UID: "67a83e3d-660c-40f0-893c-e8476053df0c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.964764 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-config-data\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.965212 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lrv\" (UniqueName: \"kubernetes.io/projected/8368d044-b088-48f9-b5cb-19a95b997576-kube-api-access-48lrv\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.965316 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-combined-ca-bundle\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.965997 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8368d044-b088-48f9-b5cb-19a95b997576-logs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.966217 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-public-tls-certs\") pod \"placement-b48fd45b4-hp2xw\" (UID: 
\"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.966244 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-scripts\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.977551 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67a83e3d-660c-40f0-893c-e8476053df0c" (UID: "67a83e3d-660c-40f0-893c-e8476053df0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.979195 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "67a83e3d-660c-40f0-893c-e8476053df0c" (UID: "67a83e3d-660c-40f0-893c-e8476053df0c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.983359 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-internal-tls-certs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.983638 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.983655 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:49 crc kubenswrapper[4789]: I1216 07:12:49.983668 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.033490 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67a83e3d-660c-40f0-893c-e8476053df0c" (UID: "67a83e3d-660c-40f0-893c-e8476053df0c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.085799 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-config-data\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.086114 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lrv\" (UniqueName: \"kubernetes.io/projected/8368d044-b088-48f9-b5cb-19a95b997576-kube-api-access-48lrv\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.086275 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-combined-ca-bundle\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.086409 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8368d044-b088-48f9-b5cb-19a95b997576-logs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.086585 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-public-tls-certs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc 
kubenswrapper[4789]: I1216 07:12:50.086735 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-scripts\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.087086 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-internal-tls-certs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.087179 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8368d044-b088-48f9-b5cb-19a95b997576-logs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.087327 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67a83e3d-660c-40f0-893c-e8476053df0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.091814 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-config-data\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.092364 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-combined-ca-bundle\") pod 
\"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.096369 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-scripts\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.096955 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-internal-tls-certs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.097300 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-public-tls-certs\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.137490 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lrv\" (UniqueName: \"kubernetes.io/projected/8368d044-b088-48f9-b5cb-19a95b997576-kube-api-access-48lrv\") pod \"placement-b48fd45b4-hp2xw\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.203662 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-klxrg"] Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.210589 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f6f8cb849-klxrg"] Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 
07:12:50.256866 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.627014 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerDied","Data":"3b016492e16aa0b6eb0cf520a0e6405b67a5e116d7ebd06f8c3f3f1f3defa394"} Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.626950 4789 generic.go:334] "Generic (PLEG): container finished" podID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerID="3b016492e16aa0b6eb0cf520a0e6405b67a5e116d7ebd06f8c3f3f1f3defa394" exitCode=0 Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.627367 4789 generic.go:334] "Generic (PLEG): container finished" podID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerID="187778fe04c696d2f6a76319e9ef6f4337b035d13109fc7b2c401c2a09d3e938" exitCode=2 Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.627470 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerDied","Data":"187778fe04c696d2f6a76319e9ef6f4337b035d13109fc7b2c401c2a09d3e938"} Dec 16 07:12:50 crc kubenswrapper[4789]: I1216 07:12:50.708241 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b48fd45b4-hp2xw"] Dec 16 07:12:50 crc kubenswrapper[4789]: W1216 07:12:50.725766 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8368d044_b088_48f9_b5cb_19a95b997576.slice/crio-368cc17b705bd079c2e4ff7f824a908affbcfc53863f91bea18600fda4e338e4 WatchSource:0}: Error finding container 368cc17b705bd079c2e4ff7f824a908affbcfc53863f91bea18600fda4e338e4: Status 404 returned error can't find the container with id 368cc17b705bd079c2e4ff7f824a908affbcfc53863f91bea18600fda4e338e4 Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 
07:12:50.999770 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-glqgh" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.014572 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-combined-ca-bundle\") pod \"5d632824-4eaa-4698-b244-88872be244b8\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.014688 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d632824-4eaa-4698-b244-88872be244b8-etc-machine-id\") pod \"5d632824-4eaa-4698-b244-88872be244b8\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.014928 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56gtt\" (UniqueName: \"kubernetes.io/projected/5d632824-4eaa-4698-b244-88872be244b8-kube-api-access-56gtt\") pod \"5d632824-4eaa-4698-b244-88872be244b8\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.014980 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-db-sync-config-data\") pod \"5d632824-4eaa-4698-b244-88872be244b8\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.015018 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-scripts\") pod \"5d632824-4eaa-4698-b244-88872be244b8\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.015050 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-config-data\") pod \"5d632824-4eaa-4698-b244-88872be244b8\" (UID: \"5d632824-4eaa-4698-b244-88872be244b8\") " Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.016028 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d632824-4eaa-4698-b244-88872be244b8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5d632824-4eaa-4698-b244-88872be244b8" (UID: "5d632824-4eaa-4698-b244-88872be244b8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.025273 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d632824-4eaa-4698-b244-88872be244b8-kube-api-access-56gtt" (OuterVolumeSpecName: "kube-api-access-56gtt") pod "5d632824-4eaa-4698-b244-88872be244b8" (UID: "5d632824-4eaa-4698-b244-88872be244b8"). InnerVolumeSpecName "kube-api-access-56gtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.030054 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-scripts" (OuterVolumeSpecName: "scripts") pod "5d632824-4eaa-4698-b244-88872be244b8" (UID: "5d632824-4eaa-4698-b244-88872be244b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.030662 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5d632824-4eaa-4698-b244-88872be244b8" (UID: "5d632824-4eaa-4698-b244-88872be244b8"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.082099 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d632824-4eaa-4698-b244-88872be244b8" (UID: "5d632824-4eaa-4698-b244-88872be244b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.110044 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-config-data" (OuterVolumeSpecName: "config-data") pod "5d632824-4eaa-4698-b244-88872be244b8" (UID: "5d632824-4eaa-4698-b244-88872be244b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.117738 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.117779 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.117794 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.117810 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d632824-4eaa-4698-b244-88872be244b8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:51 crc 
kubenswrapper[4789]: I1216 07:12:51.117823 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56gtt\" (UniqueName: \"kubernetes.io/projected/5d632824-4eaa-4698-b244-88872be244b8-kube-api-access-56gtt\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.117831 4789 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d632824-4eaa-4698-b244-88872be244b8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.183648 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.183800 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.652492 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b48fd45b4-hp2xw" event={"ID":"8368d044-b088-48f9-b5cb-19a95b997576","Type":"ContainerStarted","Data":"f58f590eff39129dc3fd6cbf997894d78a3061978019b25eafe3b8d013aa5949"} Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.652857 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b48fd45b4-hp2xw" event={"ID":"8368d044-b088-48f9-b5cb-19a95b997576","Type":"ContainerStarted","Data":"640501fd43f4fcce68155ec6cb24a721ad4ca1ea36ae7f97fe8a96f2974be91e"} Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.652872 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b48fd45b4-hp2xw" event={"ID":"8368d044-b088-48f9-b5cb-19a95b997576","Type":"ContainerStarted","Data":"368cc17b705bd079c2e4ff7f824a908affbcfc53863f91bea18600fda4e338e4"} Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.653223 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:51 crc 
kubenswrapper[4789]: I1216 07:12:51.653289 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.664370 4789 generic.go:334] "Generic (PLEG): container finished" podID="3f721de0-e915-40f9-9444-f3135f39072c" containerID="c8b668331c6026beabbd783213a89f766a203e33aae00ba9b1d79a7de6730e9f" exitCode=0 Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.664448 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kxd22" event={"ID":"3f721de0-e915-40f9-9444-f3135f39072c","Type":"ContainerDied","Data":"c8b668331c6026beabbd783213a89f766a203e33aae00ba9b1d79a7de6730e9f"} Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.682692 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b48fd45b4-hp2xw" podStartSLOduration=2.682667085 podStartE2EDuration="2.682667085s" podCreationTimestamp="2025-12-16 07:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:51.679615791 +0000 UTC m=+1309.941503420" watchObservedRunningTime="2025-12-16 07:12:51.682667085 +0000 UTC m=+1309.944554714" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.691090 4789 generic.go:334] "Generic (PLEG): container finished" podID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerID="c4db002ce0efdc2cb144b4904f1736f5dc17ac20f562094058a1e415e546010c" exitCode=0 Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.691165 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerDied","Data":"c4db002ce0efdc2cb144b4904f1736f5dc17ac20f562094058a1e415e546010c"} Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.701954 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-glqgh" 
event={"ID":"5d632824-4eaa-4698-b244-88872be244b8","Type":"ContainerDied","Data":"0fc09c07457664e105fb6e74e7cac1e6c65164b829014cb4ff717e710a3d0a1d"} Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.702000 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fc09c07457664e105fb6e74e7cac1e6c65164b829014cb4ff717e710a3d0a1d" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.702069 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-glqgh" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.929930 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:12:51 crc kubenswrapper[4789]: E1216 07:12:51.930402 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d632824-4eaa-4698-b244-88872be244b8" containerName="cinder-db-sync" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.930429 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d632824-4eaa-4698-b244-88872be244b8" containerName="cinder-db-sync" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.930670 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d632824-4eaa-4698-b244-88872be244b8" containerName="cinder-db-sync" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.931756 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.937602 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.937758 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nn726" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.937894 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.949560 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 07:12:51 crc kubenswrapper[4789]: I1216 07:12:51.958936 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.053431 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.053762 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.053836 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.053897 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q8qq\" (UniqueName: \"kubernetes.io/projected/b1a18c04-e865-4967-83e2-96198c2f99ee-kube-api-access-5q8qq\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.053943 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1a18c04-e865-4967-83e2-96198c2f99ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.053979 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.057506 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.155787 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-combined-ca-bundle\") pod \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.155987 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-scripts\") pod \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156064 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmfn7\" (UniqueName: \"kubernetes.io/projected/ea93c850-0d3d-42f5-9e00-340ea2398cdd-kube-api-access-qmfn7\") pod \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156099 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-run-httpd\") pod \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156145 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-config-data\") pod \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156226 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-log-httpd\") pod \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156331 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-sg-core-conf-yaml\") pod \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\" (UID: \"ea93c850-0d3d-42f5-9e00-340ea2398cdd\") " Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156749 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156849 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q8qq\" (UniqueName: \"kubernetes.io/projected/b1a18c04-e865-4967-83e2-96198c2f99ee-kube-api-access-5q8qq\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.156875 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1a18c04-e865-4967-83e2-96198c2f99ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.157186 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 
07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.157261 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.157292 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.159178 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea93c850-0d3d-42f5-9e00-340ea2398cdd" (UID: "ea93c850-0d3d-42f5-9e00-340ea2398cdd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.159488 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea93c850-0d3d-42f5-9e00-340ea2398cdd" (UID: "ea93c850-0d3d-42f5-9e00-340ea2398cdd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.163237 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1a18c04-e865-4967-83e2-96198c2f99ee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.197975 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.213069 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a83e3d-660c-40f0-893c-e8476053df0c" path="/var/lib/kubelet/pods/67a83e3d-660c-40f0-893c-e8476053df0c/volumes" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.221685 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.229292 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.247177 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea93c850-0d3d-42f5-9e00-340ea2398cdd-kube-api-access-qmfn7" (OuterVolumeSpecName: 
"kube-api-access-qmfn7") pod "ea93c850-0d3d-42f5-9e00-340ea2398cdd" (UID: "ea93c850-0d3d-42f5-9e00-340ea2398cdd"). InnerVolumeSpecName "kube-api-access-qmfn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.259048 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmfn7\" (UniqueName: \"kubernetes.io/projected/ea93c850-0d3d-42f5-9e00-340ea2398cdd-kube-api-access-qmfn7\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.259081 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.259090 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea93c850-0d3d-42f5-9e00-340ea2398cdd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.270305 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-scripts" (OuterVolumeSpecName: "scripts") pod "ea93c850-0d3d-42f5-9e00-340ea2398cdd" (UID: "ea93c850-0d3d-42f5-9e00-340ea2398cdd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.270969 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q8qq\" (UniqueName: \"kubernetes.io/projected/b1a18c04-e865-4967-83e2-96198c2f99ee-kube-api-access-5q8qq\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.270757 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.337024 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.341071 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea93c850-0d3d-42f5-9e00-340ea2398cdd" (UID: "ea93c850-0d3d-42f5-9e00-340ea2398cdd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.363953 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d96cd6c9c-6p6rp"] Dec 16 07:12:52 crc kubenswrapper[4789]: E1216 07:12:52.364414 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="ceilometer-notification-agent" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.364436 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="ceilometer-notification-agent" Dec 16 07:12:52 crc kubenswrapper[4789]: E1216 07:12:52.364451 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="sg-core" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.364459 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="sg-core" Dec 16 07:12:52 crc kubenswrapper[4789]: E1216 07:12:52.364489 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="proxy-httpd" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.364496 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="proxy-httpd" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.364682 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="sg-core" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.364715 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" containerName="proxy-httpd" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.364736 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" 
containerName="ceilometer-notification-agent" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.365766 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d96cd6c9c-6p6rp"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.365792 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.369651 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.369684 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.371457 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.379514 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.379647 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.384109 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.408548 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea93c850-0d3d-42f5-9e00-340ea2398cdd" (UID: "ea93c850-0d3d-42f5-9e00-340ea2398cdd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.434522 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.470774 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-scripts\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.470815 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3be47c97-0d9a-49a2-bad8-8e0c331c1074-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.470840 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqdgn\" (UniqueName: \"kubernetes.io/projected/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-kube-api-access-dqdgn\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471053 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be47c97-0d9a-49a2-bad8-8e0c331c1074-logs\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471164 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471252 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-swift-storage-0\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471302 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-config\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471560 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471595 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwmzk\" (UniqueName: \"kubernetes.io/projected/3be47c97-0d9a-49a2-bad8-8e0c331c1074-kube-api-access-lwmzk\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471614 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-nb\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471630 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data-custom\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471649 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471671 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-svc\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.471730 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.504452 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-config-data" (OuterVolumeSpecName: "config-data") pod "ea93c850-0d3d-42f5-9e00-340ea2398cdd" (UID: "ea93c850-0d3d-42f5-9e00-340ea2398cdd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574205 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574273 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwmzk\" (UniqueName: \"kubernetes.io/projected/3be47c97-0d9a-49a2-bad8-8e0c331c1074-kube-api-access-lwmzk\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574302 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-nb\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574326 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data-custom\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574346 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc 
kubenswrapper[4789]: I1216 07:12:52.574374 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-svc\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574416 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-scripts\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574442 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3be47c97-0d9a-49a2-bad8-8e0c331c1074-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574469 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqdgn\" (UniqueName: \"kubernetes.io/projected/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-kube-api-access-dqdgn\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574517 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be47c97-0d9a-49a2-bad8-8e0c331c1074-logs\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574550 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574574 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-swift-storage-0\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574595 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-config\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.574653 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea93c850-0d3d-42f5-9e00-340ea2398cdd-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.575562 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-config\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.576490 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 
07:12:52.577707 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-nb\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.578402 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3be47c97-0d9a-49a2-bad8-8e0c331c1074-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.579374 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-svc\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.579680 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be47c97-0d9a-49a2-bad8-8e0c331c1074-logs\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.584832 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-swift-storage-0\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.595170 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.595694 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data-custom\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.600535 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-scripts\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.601142 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.603623 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqdgn\" (UniqueName: \"kubernetes.io/projected/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-kube-api-access-dqdgn\") pod \"dnsmasq-dns-6d96cd6c9c-6p6rp\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.610837 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwmzk\" (UniqueName: \"kubernetes.io/projected/3be47c97-0d9a-49a2-bad8-8e0c331c1074-kube-api-access-lwmzk\") pod \"cinder-api-0\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " pod="openstack/cinder-api-0" Dec 16 07:12:52 crc 
kubenswrapper[4789]: I1216 07:12:52.728886 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.729076 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea93c850-0d3d-42f5-9e00-340ea2398cdd","Type":"ContainerDied","Data":"0097c64eb556baa829cec9e4983384719a7a28c6f2473e41cf397c4f23a462ee"} Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.729653 4789 scope.go:117] "RemoveContainer" containerID="3b016492e16aa0b6eb0cf520a0e6405b67a5e116d7ebd06f8c3f3f1f3defa394" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.760270 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.782176 4789 scope.go:117] "RemoveContainer" containerID="187778fe04c696d2f6a76319e9ef6f4337b035d13109fc7b2c401c2a09d3e938" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.798311 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.922008 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.940960 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.952991 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.973160 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.973275 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.982582 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:12:52 crc kubenswrapper[4789]: I1216 07:12:52.982899 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.031086 4789 scope.go:117] "RemoveContainer" containerID="c4db002ce0efdc2cb144b4904f1736f5dc17ac20f562094058a1e415e546010c" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.113328 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l28s4\" (UniqueName: \"kubernetes.io/projected/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-kube-api-access-l28s4\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.113777 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-config-data\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.113807 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-scripts\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.113900 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.113959 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.113988 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-run-httpd\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.114059 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-log-httpd\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.174581 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.216479 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28s4\" (UniqueName: \"kubernetes.io/projected/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-kube-api-access-l28s4\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.216584 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-config-data\") pod \"ceilometer-0\" (UID: 
\"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.216613 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-scripts\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.216702 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.216759 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.216794 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-run-httpd\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.216876 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-log-httpd\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.217589 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-log-httpd\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.219190 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-run-httpd\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.225408 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-scripts\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.232778 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.239452 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.245869 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-config-data\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.264304 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l28s4\" (UniqueName: \"kubernetes.io/projected/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-kube-api-access-l28s4\") pod \"ceilometer-0\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.307042 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.483786 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kxd22" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.640579 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-config\") pod \"3f721de0-e915-40f9-9444-f3135f39072c\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.640681 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-combined-ca-bundle\") pod \"3f721de0-e915-40f9-9444-f3135f39072c\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.640705 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zbxs\" (UniqueName: \"kubernetes.io/projected/3f721de0-e915-40f9-9444-f3135f39072c-kube-api-access-2zbxs\") pod \"3f721de0-e915-40f9-9444-f3135f39072c\" (UID: \"3f721de0-e915-40f9-9444-f3135f39072c\") " Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.661194 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f721de0-e915-40f9-9444-f3135f39072c-kube-api-access-2zbxs" (OuterVolumeSpecName: "kube-api-access-2zbxs") pod "3f721de0-e915-40f9-9444-f3135f39072c" 
(UID: "3f721de0-e915-40f9-9444-f3135f39072c"). InnerVolumeSpecName "kube-api-access-2zbxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.743408 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zbxs\" (UniqueName: \"kubernetes.io/projected/3f721de0-e915-40f9-9444-f3135f39072c-kube-api-access-2zbxs\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.776259 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f721de0-e915-40f9-9444-f3135f39072c" (UID: "3f721de0-e915-40f9-9444-f3135f39072c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.788315 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1a18c04-e865-4967-83e2-96198c2f99ee","Type":"ContainerStarted","Data":"800d65b2fa71e65be6bbcb4acc578e900009c8bc57deade7107f24700c3c0ff1"} Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.800552 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kxd22" event={"ID":"3f721de0-e915-40f9-9444-f3135f39072c","Type":"ContainerDied","Data":"4d6187a2edab5e9a96506e38b7e727ac56bb99ef20e626298b88f9e06bba0a6d"} Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.801954 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d6187a2edab5e9a96506e38b7e727ac56bb99ef20e626298b88f9e06bba0a6d" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.802268 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kxd22" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.849140 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.870074 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-config" (OuterVolumeSpecName: "config") pod "3f721de0-e915-40f9-9444-f3135f39072c" (UID: "3f721de0-e915-40f9-9444-f3135f39072c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.904657 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d96cd6c9c-6p6rp"] Dec 16 07:12:53 crc kubenswrapper[4789]: I1216 07:12:53.953254 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f721de0-e915-40f9-9444-f3135f39072c-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.035775 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d96cd6c9c-6p6rp"] Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.204690 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea93c850-0d3d-42f5-9e00-340ea2398cdd" path="/var/lib/kubelet/pods/ea93c850-0d3d-42f5-9e00-340ea2398cdd/volumes" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.218555 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-296d6"] Dec 16 07:12:54 crc kubenswrapper[4789]: E1216 07:12:54.245148 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f721de0-e915-40f9-9444-f3135f39072c" containerName="neutron-db-sync" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 
07:12:54.245175 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f721de0-e915-40f9-9444-f3135f39072c" containerName="neutron-db-sync" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.245457 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f721de0-e915-40f9-9444-f3135f39072c" containerName="neutron-db-sync" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.246258 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-296d6"] Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.246282 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.246360 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.258118 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-584d7ccd9b-5jc2j"] Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.259869 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.263135 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-584d7ccd9b-5jc2j"] Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.269102 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vlvtd" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.272237 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.272441 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.272605 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.406616 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.407485 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.407525 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-nb\") pod 
\"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.409163 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-ovndb-tls-certs\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.409248 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhvk9\" (UniqueName: \"kubernetes.io/projected/b5c97ba8-23ab-45c0-82fa-4260a301b089-kube-api-access-qhvk9\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.409435 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.410351 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-httpd-config\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.410402 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-config\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.410450 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-config\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.410540 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfchb\" (UniqueName: \"kubernetes.io/projected/9be1b65d-5e37-46d5-b7b8-ffd770eac023-kube-api-access-rfchb\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.410575 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-combined-ca-bundle\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.443654 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:12:54 crc kubenswrapper[4789]: W1216 07:12:54.467428 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf327880_91aa_47d8_8a6c_9b44ec3fd86c.slice/crio-ace60c61865e2761c20361df902fa2b13d70f3547a00ffe848db1ad429a15036 WatchSource:0}: Error finding container ace60c61865e2761c20361df902fa2b13d70f3547a00ffe848db1ad429a15036: Status 404 returned error 
can't find the container with id ace60c61865e2761c20361df902fa2b13d70f3547a00ffe848db1ad429a15036 Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513650 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-httpd-config\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513724 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-config\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513766 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-config\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513840 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfchb\" (UniqueName: \"kubernetes.io/projected/9be1b65d-5e37-46d5-b7b8-ffd770eac023-kube-api-access-rfchb\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513868 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-combined-ca-bundle\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 
07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513896 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513943 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513967 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.513985 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-ovndb-tls-certs\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.514020 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhvk9\" (UniqueName: \"kubernetes.io/projected/b5c97ba8-23ab-45c0-82fa-4260a301b089-kube-api-access-qhvk9\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.514039 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.516305 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-nb\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.516572 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-config\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.517214 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-svc\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.514952 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-swift-storage-0\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.518083 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-sb\") pod \"dnsmasq-dns-75dbb546bf-296d6\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.522372 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-combined-ca-bundle\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.523007 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-httpd-config\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.525738 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-config\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.536427 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-ovndb-tls-certs\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.552708 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhvk9\" (UniqueName: \"kubernetes.io/projected/b5c97ba8-23ab-45c0-82fa-4260a301b089-kube-api-access-qhvk9\") pod \"dnsmasq-dns-75dbb546bf-296d6\" 
(UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.552893 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfchb\" (UniqueName: \"kubernetes.io/projected/9be1b65d-5e37-46d5-b7b8-ffd770eac023-kube-api-access-rfchb\") pod \"neutron-584d7ccd9b-5jc2j\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.595535 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.596388 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.864525 4789 generic.go:334] "Generic (PLEG): container finished" podID="6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" containerID="805bf1ccd822bda455b7356f97f764273a5c30dc7256e8fde32bfc76dd838ce9" exitCode=0 Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.864607 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" event={"ID":"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962","Type":"ContainerDied","Data":"805bf1ccd822bda455b7356f97f764273a5c30dc7256e8fde32bfc76dd838ce9"} Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.864951 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" event={"ID":"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962","Type":"ContainerStarted","Data":"4d45ec5ee69a8e2fa564cb000833767cae830a6cfc26ac8136cc3c693f0df0fd"} Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.889771 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerStarted","Data":"ace60c61865e2761c20361df902fa2b13d70f3547a00ffe848db1ad429a15036"} Dec 16 07:12:54 crc kubenswrapper[4789]: I1216 07:12:54.891882 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3be47c97-0d9a-49a2-bad8-8e0c331c1074","Type":"ContainerStarted","Data":"3efb5481e126cb45f598fdb9bee8938749d9aa467801137c96822230f712f079"} Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.525792 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.666421 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.677415 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-nb\") pod \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.677533 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-swift-storage-0\") pod \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.677591 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb\") pod \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.677636 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-config\") pod \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.677669 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqdgn\" (UniqueName: \"kubernetes.io/projected/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-kube-api-access-dqdgn\") pod \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.677693 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-svc\") pod \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.752856 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-kube-api-access-dqdgn" (OuterVolumeSpecName: "kube-api-access-dqdgn") pod "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" (UID: "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962"). InnerVolumeSpecName "kube-api-access-dqdgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.779123 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqdgn\" (UniqueName: \"kubernetes.io/projected/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-kube-api-access-dqdgn\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.882779 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" (UID: "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.882996 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb\") pod \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\" (UID: \"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962\") " Dec 16 07:12:55 crc kubenswrapper[4789]: W1216 07:12:55.883069 4789 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962/volumes/kubernetes.io~configmap/ovsdbserver-sb Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.883091 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" (UID: "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.884014 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.898322 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-296d6"] Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.930866 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" (UID: "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.936385 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-config" (OuterVolumeSpecName: "config") pod "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" (UID: "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.982126 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" (UID: "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.983684 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" event={"ID":"6a37b543-0c84-4a5f-a23c-ce1bb1ff3962","Type":"ContainerDied","Data":"4d45ec5ee69a8e2fa564cb000833767cae830a6cfc26ac8136cc3c693f0df0fd"} Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.983729 4789 scope.go:117] "RemoveContainer" containerID="805bf1ccd822bda455b7356f97f764273a5c30dc7256e8fde32bfc76dd838ce9" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.984707 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d96cd6c9c-6p6rp" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.985340 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" (UID: "6a37b543-0c84-4a5f-a23c-ce1bb1ff3962"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.987290 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.987316 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.987328 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:55 crc kubenswrapper[4789]: I1216 07:12:55.987336 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.074169 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-584d7ccd9b-5jc2j"] Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.250123 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.423044 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d96cd6c9c-6p6rp"] Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.495666 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d96cd6c9c-6p6rp"] Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.508725 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.591097 4789 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8f6654c7b-cm5sf"] Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.591297 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8f6654c7b-cm5sf" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api-log" containerID="cri-o://2e7281997b6da968c5c0f17931656533f9519f8f8bf9919d3f5f3ac881bce263" gracePeriod=30 Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.591661 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8f6654c7b-cm5sf" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api" containerID="cri-o://9bbe35556e1b3bea7379d840e0600ff1ce16e3961c559c5ea3950f67a8e60cb2" gracePeriod=30 Dec 16 07:12:56 crc kubenswrapper[4789]: I1216 07:12:56.631155 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8f6654c7b-cm5sf" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.031242 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1a18c04-e865-4967-83e2-96198c2f99ee","Type":"ContainerStarted","Data":"5a8f96797d140ba03c1aa349575e39d73c6baa0ee541351d5a9d7643d766738b"} Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.043112 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3be47c97-0d9a-49a2-bad8-8e0c331c1074","Type":"ContainerStarted","Data":"5c1b5d0919acdd9882e8e1740dc0ca0b42fa0cba5a0a4fa31d929980dd194a37"} Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.075957 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584d7ccd9b-5jc2j" 
event={"ID":"9be1b65d-5e37-46d5-b7b8-ffd770eac023","Type":"ContainerStarted","Data":"23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828"} Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.076013 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584d7ccd9b-5jc2j" event={"ID":"9be1b65d-5e37-46d5-b7b8-ffd770eac023","Type":"ContainerStarted","Data":"e6eda3cb332bfaf175685b026abb30c0bf34283ab73588e6da327cd9d7fe1e43"} Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.082065 4789 generic.go:334] "Generic (PLEG): container finished" podID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerID="2e7281997b6da968c5c0f17931656533f9519f8f8bf9919d3f5f3ac881bce263" exitCode=143 Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.082139 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8f6654c7b-cm5sf" event={"ID":"765d73af-7fc6-49af-8c1e-e7558e7f5350","Type":"ContainerDied","Data":"2e7281997b6da968c5c0f17931656533f9519f8f8bf9919d3f5f3ac881bce263"} Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.088282 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerStarted","Data":"cb9d68d7976543c4e65c36ed7dd1ec2471f3528f610140fbe2a0663fb1ea6c9d"} Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.092414 4789 generic.go:334] "Generic (PLEG): container finished" podID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerID="d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322" exitCode=0 Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.092459 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" event={"ID":"b5c97ba8-23ab-45c0-82fa-4260a301b089","Type":"ContainerDied","Data":"d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322"} Dec 16 07:12:57 crc kubenswrapper[4789]: I1216 07:12:57.092481 4789 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" event={"ID":"b5c97ba8-23ab-45c0-82fa-4260a301b089","Type":"ContainerStarted","Data":"ec192f7b6089cf9d77b9c1a03241fe97cf527c845504ce7c5564b0b43bc5048f"} Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.115734 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api-log" containerID="cri-o://5c1b5d0919acdd9882e8e1740dc0ca0b42fa0cba5a0a4fa31d929980dd194a37" gracePeriod=30 Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.115780 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api" containerID="cri-o://9075b8be41b3de9d20d2cfcd834e9f38a17878dca5a2a2cc49d4db50f9c38251" gracePeriod=30 Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.123950 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" path="/var/lib/kubelet/pods/6a37b543-0c84-4a5f-a23c-ce1bb1ff3962/volumes" Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.124580 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1a18c04-e865-4967-83e2-96198c2f99ee","Type":"ContainerStarted","Data":"f869c499dafdb0c40215188b916a6e3833a3f41e0dbc7e97a39d1b70ad3b5539"} Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.124621 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3be47c97-0d9a-49a2-bad8-8e0c331c1074","Type":"ContainerStarted","Data":"9075b8be41b3de9d20d2cfcd834e9f38a17878dca5a2a2cc49d4db50f9c38251"} Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.126602 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584d7ccd9b-5jc2j" 
event={"ID":"9be1b65d-5e37-46d5-b7b8-ffd770eac023","Type":"ContainerStarted","Data":"3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe"} Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.127551 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.131301 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" event={"ID":"b5c97ba8-23ab-45c0-82fa-4260a301b089","Type":"ContainerStarted","Data":"f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac"} Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.132234 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.138678 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.762839673 podStartE2EDuration="7.138660392s" podCreationTimestamp="2025-12-16 07:12:51 +0000 UTC" firstStartedPulling="2025-12-16 07:12:53.219256649 +0000 UTC m=+1311.481144278" lastFinishedPulling="2025-12-16 07:12:54.595077368 +0000 UTC m=+1312.856964997" observedRunningTime="2025-12-16 07:12:58.133882646 +0000 UTC m=+1316.395770275" watchObservedRunningTime="2025-12-16 07:12:58.138660392 +0000 UTC m=+1316.400548021" Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.162100 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-584d7ccd9b-5jc2j" podStartSLOduration=4.162076641 podStartE2EDuration="4.162076641s" podCreationTimestamp="2025-12-16 07:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:58.157438208 +0000 UTC m=+1316.419325837" watchObservedRunningTime="2025-12-16 07:12:58.162076641 +0000 UTC m=+1316.423964270" 
Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.186317 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.186296659 podStartE2EDuration="6.186296659s" podCreationTimestamp="2025-12-16 07:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:58.185449309 +0000 UTC m=+1316.447336948" watchObservedRunningTime="2025-12-16 07:12:58.186296659 +0000 UTC m=+1316.448184288" Dec 16 07:12:58 crc kubenswrapper[4789]: I1216 07:12:58.220283 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" podStartSLOduration=5.220266075 podStartE2EDuration="5.220266075s" podCreationTimestamp="2025-12-16 07:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:12:58.211327717 +0000 UTC m=+1316.473215366" watchObservedRunningTime="2025-12-16 07:12:58.220266075 +0000 UTC m=+1316.482153704" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.141502 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerStarted","Data":"9f0d9f99d3b06828a566d4f372b298f3a80e1ef7ec89935e786c589bc7ffc504"} Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.143935 4789 generic.go:334] "Generic (PLEG): container finished" podID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerID="9075b8be41b3de9d20d2cfcd834e9f38a17878dca5a2a2cc49d4db50f9c38251" exitCode=0 Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.143966 4789 generic.go:334] "Generic (PLEG): container finished" podID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerID="5c1b5d0919acdd9882e8e1740dc0ca0b42fa0cba5a0a4fa31d929980dd194a37" exitCode=143 Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.143943 
4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3be47c97-0d9a-49a2-bad8-8e0c331c1074","Type":"ContainerDied","Data":"9075b8be41b3de9d20d2cfcd834e9f38a17878dca5a2a2cc49d4db50f9c38251"} Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.144147 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3be47c97-0d9a-49a2-bad8-8e0c331c1074","Type":"ContainerDied","Data":"5c1b5d0919acdd9882e8e1740dc0ca0b42fa0cba5a0a4fa31d929980dd194a37"} Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.221712 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5787d477bc-ccrwj"] Dec 16 07:12:59 crc kubenswrapper[4789]: E1216 07:12:59.222153 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" containerName="init" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.222172 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" containerName="init" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.222339 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a37b543-0c84-4a5f-a23c-ce1bb1ff3962" containerName="init" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.223873 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.228982 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.230847 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.252739 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5787d477bc-ccrwj"] Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.376871 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-public-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.377450 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-internal-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.377487 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-combined-ca-bundle\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.377524 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhs2\" (UniqueName: 
\"kubernetes.io/projected/73660d16-d925-4e43-8df7-2c40959bb7ed-kube-api-access-hnhs2\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.377581 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-ovndb-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.378095 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-config\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.378275 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-httpd-config\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.481954 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-config\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.482049 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-httpd-config\") 
pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.482133 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-public-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.482333 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-internal-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.482364 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-combined-ca-bundle\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.482391 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhs2\" (UniqueName: \"kubernetes.io/projected/73660d16-d925-4e43-8df7-2c40959bb7ed-kube-api-access-hnhs2\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.482437 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-ovndb-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: 
\"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.490668 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-public-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.493001 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-config\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.494064 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-httpd-config\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.495648 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-internal-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.495702 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-combined-ca-bundle\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.497862 
4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-ovndb-tls-certs\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.511565 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhs2\" (UniqueName: \"kubernetes.io/projected/73660d16-d925-4e43-8df7-2c40959bb7ed-kube-api-access-hnhs2\") pod \"neutron-5787d477bc-ccrwj\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.541449 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.744204 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.890515 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwmzk\" (UniqueName: \"kubernetes.io/projected/3be47c97-0d9a-49a2-bad8-8e0c331c1074-kube-api-access-lwmzk\") pod \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.890703 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be47c97-0d9a-49a2-bad8-8e0c331c1074-logs\") pod \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.890802 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-combined-ca-bundle\") pod \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.890865 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3be47c97-0d9a-49a2-bad8-8e0c331c1074-etc-machine-id\") pod \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.890903 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-scripts\") pod \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.890951 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data\") pod \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.890987 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data-custom\") pod \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\" (UID: \"3be47c97-0d9a-49a2-bad8-8e0c331c1074\") " Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.891330 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be47c97-0d9a-49a2-bad8-8e0c331c1074-logs" (OuterVolumeSpecName: "logs") pod "3be47c97-0d9a-49a2-bad8-8e0c331c1074" (UID: "3be47c97-0d9a-49a2-bad8-8e0c331c1074"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.891670 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3be47c97-0d9a-49a2-bad8-8e0c331c1074-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3be47c97-0d9a-49a2-bad8-8e0c331c1074" (UID: "3be47c97-0d9a-49a2-bad8-8e0c331c1074"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.902265 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3be47c97-0d9a-49a2-bad8-8e0c331c1074" (UID: "3be47c97-0d9a-49a2-bad8-8e0c331c1074"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.902478 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be47c97-0d9a-49a2-bad8-8e0c331c1074-kube-api-access-lwmzk" (OuterVolumeSpecName: "kube-api-access-lwmzk") pod "3be47c97-0d9a-49a2-bad8-8e0c331c1074" (UID: "3be47c97-0d9a-49a2-bad8-8e0c331c1074"). InnerVolumeSpecName "kube-api-access-lwmzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.904132 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-scripts" (OuterVolumeSpecName: "scripts") pod "3be47c97-0d9a-49a2-bad8-8e0c331c1074" (UID: "3be47c97-0d9a-49a2-bad8-8e0c331c1074"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.948723 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3be47c97-0d9a-49a2-bad8-8e0c331c1074" (UID: "3be47c97-0d9a-49a2-bad8-8e0c331c1074"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.967094 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data" (OuterVolumeSpecName: "config-data") pod "3be47c97-0d9a-49a2-bad8-8e0c331c1074" (UID: "3be47c97-0d9a-49a2-bad8-8e0c331c1074"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.992892 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be47c97-0d9a-49a2-bad8-8e0c331c1074-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.993027 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.993043 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3be47c97-0d9a-49a2-bad8-8e0c331c1074-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.993054 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:59 crc 
kubenswrapper[4789]: I1216 07:12:59.993068 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.993079 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3be47c97-0d9a-49a2-bad8-8e0c331c1074-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:12:59 crc kubenswrapper[4789]: I1216 07:12:59.993115 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwmzk\" (UniqueName: \"kubernetes.io/projected/3be47c97-0d9a-49a2-bad8-8e0c331c1074-kube-api-access-lwmzk\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.155797 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerStarted","Data":"61cee93c6ad5fb754fb2d1a17fe445e063006df0345da5c32bc08a1cebe830f7"} Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.159170 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3be47c97-0d9a-49a2-bad8-8e0c331c1074","Type":"ContainerDied","Data":"3efb5481e126cb45f598fdb9bee8938749d9aa467801137c96822230f712f079"} Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.159250 4789 scope.go:117] "RemoveContainer" containerID="9075b8be41b3de9d20d2cfcd834e9f38a17878dca5a2a2cc49d4db50f9c38251" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.159495 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.190811 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.212743 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.216217 4789 scope.go:117] "RemoveContainer" containerID="5c1b5d0919acdd9882e8e1740dc0ca0b42fa0cba5a0a4fa31d929980dd194a37" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.227686 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:13:00 crc kubenswrapper[4789]: E1216 07:13:00.228196 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api-log" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.228222 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api-log" Dec 16 07:13:00 crc kubenswrapper[4789]: E1216 07:13:00.228236 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.228246 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.228485 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.228506 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" containerName="cinder-api-log" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.229820 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.235751 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.236084 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.236335 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.250536 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.265621 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5787d477bc-ccrwj"] Dec 16 07:13:00 crc kubenswrapper[4789]: W1216 07:13:00.272180 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73660d16_d925_4e43_8df7_2c40959bb7ed.slice/crio-7b507608a565ab382c678b5753862f64a6fdcc81cac0a58dd680a8bde4b844de WatchSource:0}: Error finding container 7b507608a565ab382c678b5753862f64a6fdcc81cac0a58dd680a8bde4b844de: Status 404 returned error can't find the container with id 7b507608a565ab382c678b5753862f64a6fdcc81cac0a58dd680a8bde4b844de Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.299628 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.299755 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data-custom\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.299797 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.299823 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skc8q\" (UniqueName: \"kubernetes.io/projected/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-kube-api-access-skc8q\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.299856 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.299964 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-scripts\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.299990 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.300053 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.300751 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-logs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403048 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403111 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skc8q\" (UniqueName: \"kubernetes.io/projected/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-kube-api-access-skc8q\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403156 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403238 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-scripts\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403266 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403296 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403316 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-logs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403397 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.403466 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data-custom\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " 
pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.404134 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.405462 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-logs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.408148 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.408214 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.408790 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-scripts\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.409144 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data\") pod 
\"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.410179 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data-custom\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.415444 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.429270 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skc8q\" (UniqueName: \"kubernetes.io/projected/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-kube-api-access-skc8q\") pod \"cinder-api-0\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " pod="openstack/cinder-api-0" Dec 16 07:13:00 crc kubenswrapper[4789]: I1216 07:13:00.556969 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.069789 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8f6654c7b-cm5sf" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:41182->10.217.0.154:9311: read: connection reset by peer" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.069870 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8f6654c7b-cm5sf" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:41170->10.217.0.154:9311: read: connection reset by peer" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.090093 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.170017 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5787d477bc-ccrwj" event={"ID":"73660d16-d925-4e43-8df7-2c40959bb7ed","Type":"ContainerStarted","Data":"d61236e0a1b169ed76d6b190800ead5dd0b19f9acc8d953c9f3b75b5c79591fd"} Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.170053 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5787d477bc-ccrwj" event={"ID":"73660d16-d925-4e43-8df7-2c40959bb7ed","Type":"ContainerStarted","Data":"1811fc6d133a6d47f93c7b8e7704ffe66b0cb1ade5e47088042f32756e1e0944"} Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.170063 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5787d477bc-ccrwj" event={"ID":"73660d16-d925-4e43-8df7-2c40959bb7ed","Type":"ContainerStarted","Data":"7b507608a565ab382c678b5753862f64a6fdcc81cac0a58dd680a8bde4b844de"} Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.171085 4789 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.172975 4789 generic.go:334] "Generic (PLEG): container finished" podID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerID="9bbe35556e1b3bea7379d840e0600ff1ce16e3961c559c5ea3950f67a8e60cb2" exitCode=0 Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.173041 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8f6654c7b-cm5sf" event={"ID":"765d73af-7fc6-49af-8c1e-e7558e7f5350","Type":"ContainerDied","Data":"9bbe35556e1b3bea7379d840e0600ff1ce16e3961c559c5ea3950f67a8e60cb2"} Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.183161 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerStarted","Data":"b556dfe0b5cec3e2265012d23751e66482601957b63991c82cf1f1f9f278a236"} Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.183316 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.184974 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5","Type":"ContainerStarted","Data":"c06be19a5d6804f3e384f9fe32b42d0aabb7abea807e440b6d2e5268f311be81"} Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.195028 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5787d477bc-ccrwj" podStartSLOduration=2.195011041 podStartE2EDuration="2.195011041s" podCreationTimestamp="2025-12-16 07:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:01.191622829 +0000 UTC m=+1319.453510458" watchObservedRunningTime="2025-12-16 07:13:01.195011041 +0000 UTC m=+1319.456898660" 
Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.233802 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8930152959999997 podStartE2EDuration="9.233782723s" podCreationTimestamp="2025-12-16 07:12:52 +0000 UTC" firstStartedPulling="2025-12-16 07:12:54.475100334 +0000 UTC m=+1312.736987963" lastFinishedPulling="2025-12-16 07:13:00.815867761 +0000 UTC m=+1319.077755390" observedRunningTime="2025-12-16 07:13:01.220813718 +0000 UTC m=+1319.482701347" watchObservedRunningTime="2025-12-16 07:13:01.233782723 +0000 UTC m=+1319.495670352" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.430784 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.531495 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9q7z\" (UniqueName: \"kubernetes.io/projected/765d73af-7fc6-49af-8c1e-e7558e7f5350-kube-api-access-f9q7z\") pod \"765d73af-7fc6-49af-8c1e-e7558e7f5350\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.531627 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/765d73af-7fc6-49af-8c1e-e7558e7f5350-logs\") pod \"765d73af-7fc6-49af-8c1e-e7558e7f5350\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.531820 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-combined-ca-bundle\") pod \"765d73af-7fc6-49af-8c1e-e7558e7f5350\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.531875 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data\") pod \"765d73af-7fc6-49af-8c1e-e7558e7f5350\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.531963 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data-custom\") pod \"765d73af-7fc6-49af-8c1e-e7558e7f5350\" (UID: \"765d73af-7fc6-49af-8c1e-e7558e7f5350\") " Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.538377 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765d73af-7fc6-49af-8c1e-e7558e7f5350-logs" (OuterVolumeSpecName: "logs") pod "765d73af-7fc6-49af-8c1e-e7558e7f5350" (UID: "765d73af-7fc6-49af-8c1e-e7558e7f5350"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.549824 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765d73af-7fc6-49af-8c1e-e7558e7f5350-kube-api-access-f9q7z" (OuterVolumeSpecName: "kube-api-access-f9q7z") pod "765d73af-7fc6-49af-8c1e-e7558e7f5350" (UID: "765d73af-7fc6-49af-8c1e-e7558e7f5350"). InnerVolumeSpecName "kube-api-access-f9q7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.551071 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "765d73af-7fc6-49af-8c1e-e7558e7f5350" (UID: "765d73af-7fc6-49af-8c1e-e7558e7f5350"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.602107 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "765d73af-7fc6-49af-8c1e-e7558e7f5350" (UID: "765d73af-7fc6-49af-8c1e-e7558e7f5350"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.645164 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.645205 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9q7z\" (UniqueName: \"kubernetes.io/projected/765d73af-7fc6-49af-8c1e-e7558e7f5350-kube-api-access-f9q7z\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.645217 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/765d73af-7fc6-49af-8c1e-e7558e7f5350-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.645227 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.653058 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data" (OuterVolumeSpecName: "config-data") pod "765d73af-7fc6-49af-8c1e-e7558e7f5350" (UID: "765d73af-7fc6-49af-8c1e-e7558e7f5350"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:01 crc kubenswrapper[4789]: I1216 07:13:01.746522 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765d73af-7fc6-49af-8c1e-e7558e7f5350-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.120631 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be47c97-0d9a-49a2-bad8-8e0c331c1074" path="/var/lib/kubelet/pods/3be47c97-0d9a-49a2-bad8-8e0c331c1074/volumes" Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.203128 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8f6654c7b-cm5sf" event={"ID":"765d73af-7fc6-49af-8c1e-e7558e7f5350","Type":"ContainerDied","Data":"70112c2d4331dfccf9367ce3625a7cd7198abb4a46517f0a9d47e7e66f3b1748"} Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.203420 4789 scope.go:117] "RemoveContainer" containerID="9bbe35556e1b3bea7379d840e0600ff1ce16e3961c559c5ea3950f67a8e60cb2" Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.203534 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8f6654c7b-cm5sf" Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.210142 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5","Type":"ContainerStarted","Data":"075adba855be9f7509e9630110074278e486377f145b6b3fe7500199bbeb6d6c"} Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.235220 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8f6654c7b-cm5sf"] Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.245344 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8f6654c7b-cm5sf"] Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.249219 4789 scope.go:117] "RemoveContainer" containerID="2e7281997b6da968c5c0f17931656533f9519f8f8bf9919d3f5f3ac881bce263" Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.340577 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 07:13:02 crc kubenswrapper[4789]: I1216 07:13:02.630606 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 07:13:03 crc kubenswrapper[4789]: I1216 07:13:03.218759 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5","Type":"ContainerStarted","Data":"dfe47974cb64535408aeb67063f1a4814aa8aaad5cefa3463823fe0dd085e7b6"} Dec 16 07:13:03 crc kubenswrapper[4789]: I1216 07:13:03.219211 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 07:13:03 crc kubenswrapper[4789]: I1216 07:13:03.240703 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.240682231 podStartE2EDuration="3.240682231s" podCreationTimestamp="2025-12-16 07:13:00 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:03.237279618 +0000 UTC m=+1321.499167257" watchObservedRunningTime="2025-12-16 07:13:03.240682231 +0000 UTC m=+1321.502569860" Dec 16 07:13:03 crc kubenswrapper[4789]: I1216 07:13:03.296734 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:13:04 crc kubenswrapper[4789]: I1216 07:13:04.114498 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" path="/var/lib/kubelet/pods/765d73af-7fc6-49af-8c1e-e7558e7f5350/volumes" Dec 16 07:13:04 crc kubenswrapper[4789]: I1216 07:13:04.229218 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="cinder-scheduler" containerID="cri-o://5a8f96797d140ba03c1aa349575e39d73c6baa0ee541351d5a9d7643d766738b" gracePeriod=30 Dec 16 07:13:04 crc kubenswrapper[4789]: I1216 07:13:04.229663 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="probe" containerID="cri-o://f869c499dafdb0c40215188b916a6e3833a3f41e0dbc7e97a39d1b70ad3b5539" gracePeriod=30 Dec 16 07:13:04 crc kubenswrapper[4789]: I1216 07:13:04.598319 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:13:04 crc kubenswrapper[4789]: I1216 07:13:04.660860 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8fffc8985-mdm7j"] Dec 16 07:13:04 crc kubenswrapper[4789]: I1216 07:13:04.661130 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" podUID="0d472548-77d0-45dc-846c-b36f55c20d05" containerName="dnsmasq-dns" 
containerID="cri-o://c41d925e9962e9941cadb03150aa4d3844326a4c3c3b013fef2f48da381dfd6e" gracePeriod=10 Dec 16 07:13:05 crc kubenswrapper[4789]: I1216 07:13:05.240312 4789 generic.go:334] "Generic (PLEG): container finished" podID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerID="f869c499dafdb0c40215188b916a6e3833a3f41e0dbc7e97a39d1b70ad3b5539" exitCode=0 Dec 16 07:13:05 crc kubenswrapper[4789]: I1216 07:13:05.240367 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1a18c04-e865-4967-83e2-96198c2f99ee","Type":"ContainerDied","Data":"f869c499dafdb0c40215188b916a6e3833a3f41e0dbc7e97a39d1b70ad3b5539"} Dec 16 07:13:05 crc kubenswrapper[4789]: I1216 07:13:05.243312 4789 generic.go:334] "Generic (PLEG): container finished" podID="0d472548-77d0-45dc-846c-b36f55c20d05" containerID="c41d925e9962e9941cadb03150aa4d3844326a4c3c3b013fef2f48da381dfd6e" exitCode=0 Dec 16 07:13:05 crc kubenswrapper[4789]: I1216 07:13:05.243356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" event={"ID":"0d472548-77d0-45dc-846c-b36f55c20d05","Type":"ContainerDied","Data":"c41d925e9962e9941cadb03150aa4d3844326a4c3c3b013fef2f48da381dfd6e"} Dec 16 07:13:05 crc kubenswrapper[4789]: I1216 07:13:05.972629 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.049611 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-sb\") pod \"0d472548-77d0-45dc-846c-b36f55c20d05\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.049796 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-config\") pod \"0d472548-77d0-45dc-846c-b36f55c20d05\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.049814 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-nb\") pod \"0d472548-77d0-45dc-846c-b36f55c20d05\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.049833 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt7sp\" (UniqueName: \"kubernetes.io/projected/0d472548-77d0-45dc-846c-b36f55c20d05-kube-api-access-vt7sp\") pod \"0d472548-77d0-45dc-846c-b36f55c20d05\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.049892 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-svc\") pod \"0d472548-77d0-45dc-846c-b36f55c20d05\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.049925 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-swift-storage-0\") pod \"0d472548-77d0-45dc-846c-b36f55c20d05\" (UID: \"0d472548-77d0-45dc-846c-b36f55c20d05\") " Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.055471 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d472548-77d0-45dc-846c-b36f55c20d05-kube-api-access-vt7sp" (OuterVolumeSpecName: "kube-api-access-vt7sp") pod "0d472548-77d0-45dc-846c-b36f55c20d05" (UID: "0d472548-77d0-45dc-846c-b36f55c20d05"). InnerVolumeSpecName "kube-api-access-vt7sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.093586 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d472548-77d0-45dc-846c-b36f55c20d05" (UID: "0d472548-77d0-45dc-846c-b36f55c20d05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.094664 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d472548-77d0-45dc-846c-b36f55c20d05" (UID: "0d472548-77d0-45dc-846c-b36f55c20d05"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.096944 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0d472548-77d0-45dc-846c-b36f55c20d05" (UID: "0d472548-77d0-45dc-846c-b36f55c20d05"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.097473 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-config" (OuterVolumeSpecName: "config") pod "0d472548-77d0-45dc-846c-b36f55c20d05" (UID: "0d472548-77d0-45dc-846c-b36f55c20d05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.103775 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d472548-77d0-45dc-846c-b36f55c20d05" (UID: "0d472548-77d0-45dc-846c-b36f55c20d05"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.152175 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.152209 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.152218 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.152230 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt7sp\" (UniqueName: \"kubernetes.io/projected/0d472548-77d0-45dc-846c-b36f55c20d05-kube-api-access-vt7sp\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc 
kubenswrapper[4789]: I1216 07:13:06.152239 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.152247 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d472548-77d0-45dc-846c-b36f55c20d05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.253069 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" event={"ID":"0d472548-77d0-45dc-846c-b36f55c20d05","Type":"ContainerDied","Data":"0e24f62d7f83361d7342c0cdd24256d092a736eb696dcb4ddcfc298b3f05e9ae"} Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.253117 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8fffc8985-mdm7j" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.253155 4789 scope.go:117] "RemoveContainer" containerID="c41d925e9962e9941cadb03150aa4d3844326a4c3c3b013fef2f48da381dfd6e" Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.281703 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8fffc8985-mdm7j"] Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.291679 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8fffc8985-mdm7j"] Dec 16 07:13:06 crc kubenswrapper[4789]: I1216 07:13:06.294112 4789 scope.go:117] "RemoveContainer" containerID="a982c1baac7a3603c6318a205667d362036ffc12d8dcc384c21001c00f1bcb52" Dec 16 07:13:07 crc kubenswrapper[4789]: I1216 07:13:07.261367 4789 generic.go:334] "Generic (PLEG): container finished" podID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerID="5a8f96797d140ba03c1aa349575e39d73c6baa0ee541351d5a9d7643d766738b" exitCode=0 Dec 16 07:13:07 crc kubenswrapper[4789]: I1216 
07:13:07.261439 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1a18c04-e865-4967-83e2-96198c2f99ee","Type":"ContainerDied","Data":"5a8f96797d140ba03c1aa349575e39d73c6baa0ee541351d5a9d7643d766738b"} Dec 16 07:13:07 crc kubenswrapper[4789]: I1216 07:13:07.880186 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.007748 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1a18c04-e865-4967-83e2-96198c2f99ee-etc-machine-id\") pod \"b1a18c04-e865-4967-83e2-96198c2f99ee\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.007797 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data-custom\") pod \"b1a18c04-e865-4967-83e2-96198c2f99ee\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.007876 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q8qq\" (UniqueName: \"kubernetes.io/projected/b1a18c04-e865-4967-83e2-96198c2f99ee-kube-api-access-5q8qq\") pod \"b1a18c04-e865-4967-83e2-96198c2f99ee\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.007902 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1a18c04-e865-4967-83e2-96198c2f99ee-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b1a18c04-e865-4967-83e2-96198c2f99ee" (UID: "b1a18c04-e865-4967-83e2-96198c2f99ee"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.007930 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-scripts\") pod \"b1a18c04-e865-4967-83e2-96198c2f99ee\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.008015 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-combined-ca-bundle\") pod \"b1a18c04-e865-4967-83e2-96198c2f99ee\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.008054 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data\") pod \"b1a18c04-e865-4967-83e2-96198c2f99ee\" (UID: \"b1a18c04-e865-4967-83e2-96198c2f99ee\") " Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.008543 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1a18c04-e865-4967-83e2-96198c2f99ee-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.015369 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a18c04-e865-4967-83e2-96198c2f99ee-kube-api-access-5q8qq" (OuterVolumeSpecName: "kube-api-access-5q8qq") pod "b1a18c04-e865-4967-83e2-96198c2f99ee" (UID: "b1a18c04-e865-4967-83e2-96198c2f99ee"). InnerVolumeSpecName "kube-api-access-5q8qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.015388 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b1a18c04-e865-4967-83e2-96198c2f99ee" (UID: "b1a18c04-e865-4967-83e2-96198c2f99ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.019015 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-scripts" (OuterVolumeSpecName: "scripts") pod "b1a18c04-e865-4967-83e2-96198c2f99ee" (UID: "b1a18c04-e865-4967-83e2-96198c2f99ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.066279 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1a18c04-e865-4967-83e2-96198c2f99ee" (UID: "b1a18c04-e865-4967-83e2-96198c2f99ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.110501 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q8qq\" (UniqueName: \"kubernetes.io/projected/b1a18c04-e865-4967-83e2-96198c2f99ee-kube-api-access-5q8qq\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.110532 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.110542 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.110550 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.124347 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data" (OuterVolumeSpecName: "config-data") pod "b1a18c04-e865-4967-83e2-96198c2f99ee" (UID: "b1a18c04-e865-4967-83e2-96198c2f99ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.124899 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d472548-77d0-45dc-846c-b36f55c20d05" path="/var/lib/kubelet/pods/0d472548-77d0-45dc-846c-b36f55c20d05/volumes" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.212875 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1a18c04-e865-4967-83e2-96198c2f99ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.273187 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1a18c04-e865-4967-83e2-96198c2f99ee","Type":"ContainerDied","Data":"800d65b2fa71e65be6bbcb4acc578e900009c8bc57deade7107f24700c3c0ff1"} Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.273247 4789 scope.go:117] "RemoveContainer" containerID="f869c499dafdb0c40215188b916a6e3833a3f41e0dbc7e97a39d1b70ad3b5539" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.273381 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.307610 4789 scope.go:117] "RemoveContainer" containerID="5a8f96797d140ba03c1aa349575e39d73c6baa0ee541351d5a9d7643d766738b" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.309682 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.326832 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.362330 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:13:08 crc kubenswrapper[4789]: E1216 07:13:08.362728 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d472548-77d0-45dc-846c-b36f55c20d05" containerName="dnsmasq-dns" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.362747 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d472548-77d0-45dc-846c-b36f55c20d05" containerName="dnsmasq-dns" Dec 16 07:13:08 crc kubenswrapper[4789]: E1216 07:13:08.362756 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.362763 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api" Dec 16 07:13:08 crc kubenswrapper[4789]: E1216 07:13:08.362789 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api-log" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.362795 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api-log" Dec 16 07:13:08 crc kubenswrapper[4789]: E1216 07:13:08.362803 4789 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0d472548-77d0-45dc-846c-b36f55c20d05" containerName="init" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.362808 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d472548-77d0-45dc-846c-b36f55c20d05" containerName="init" Dec 16 07:13:08 crc kubenswrapper[4789]: E1216 07:13:08.362820 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="cinder-scheduler" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.362826 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="cinder-scheduler" Dec 16 07:13:08 crc kubenswrapper[4789]: E1216 07:13:08.362839 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="probe" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.362846 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="probe" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.363050 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d472548-77d0-45dc-846c-b36f55c20d05" containerName="dnsmasq-dns" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.363070 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api-log" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.363086 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="765d73af-7fc6-49af-8c1e-e7558e7f5350" containerName="barbican-api" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.363106 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="probe" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.363117 4789 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" containerName="cinder-scheduler" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.366999 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.370532 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.390437 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.427329 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.427424 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.427445 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.427483 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cct4q\" (UniqueName: 
\"kubernetes.io/projected/de637363-990a-4590-b9c5-ab66c18ec270-kube-api-access-cct4q\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.427542 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-scripts\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.427567 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de637363-990a-4590-b9c5-ab66c18ec270-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.528678 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.528762 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.528782 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.528799 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cct4q\" (UniqueName: \"kubernetes.io/projected/de637363-990a-4590-b9c5-ab66c18ec270-kube-api-access-cct4q\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.529448 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-scripts\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.529528 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de637363-990a-4590-b9c5-ab66c18ec270-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.529674 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de637363-990a-4590-b9c5-ab66c18ec270-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.533096 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.533360 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.533400 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-scripts\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.541244 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.554057 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cct4q\" (UniqueName: \"kubernetes.io/projected/de637363-990a-4590-b9c5-ab66c18ec270-kube-api-access-cct4q\") pod \"cinder-scheduler-0\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " pod="openstack/cinder-scheduler-0" Dec 16 07:13:08 crc kubenswrapper[4789]: I1216 07:13:08.691599 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:13:09 crc kubenswrapper[4789]: I1216 07:13:09.167274 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:13:09 crc kubenswrapper[4789]: I1216 07:13:09.286710 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de637363-990a-4590-b9c5-ab66c18ec270","Type":"ContainerStarted","Data":"226f8c67ed7aa831b0b83d7a7a57da486ad5e31f13bab83cb64f67e17d72ce11"} Dec 16 07:13:09 crc kubenswrapper[4789]: I1216 07:13:09.711050 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6789db9888-57dmq" Dec 16 07:13:10 crc kubenswrapper[4789]: I1216 07:13:10.115244 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a18c04-e865-4967-83e2-96198c2f99ee" path="/var/lib/kubelet/pods/b1a18c04-e865-4967-83e2-96198c2f99ee/volumes" Dec 16 07:13:10 crc kubenswrapper[4789]: I1216 07:13:10.297238 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de637363-990a-4590-b9c5-ab66c18ec270","Type":"ContainerStarted","Data":"7a559de8c4fc233747aed0e14dd0fbf6aa46b087f910b2378d491cf160c0c80e"} Dec 16 07:13:11 crc kubenswrapper[4789]: I1216 07:13:11.308098 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de637363-990a-4590-b9c5-ab66c18ec270","Type":"ContainerStarted","Data":"1fd3bff06a6b8d682fd662de9fa43cca1d63dd71d4ca1dc0b4dd34b7b40fb7d8"} Dec 16 07:13:12 crc kubenswrapper[4789]: I1216 07:13:12.533528 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 07:13:12 crc kubenswrapper[4789]: I1216 07:13:12.561743 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.56172336 podStartE2EDuration="4.56172336s" podCreationTimestamp="2025-12-16 
07:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:11.331725333 +0000 UTC m=+1329.593612972" watchObservedRunningTime="2025-12-16 07:13:12.56172336 +0000 UTC m=+1330.823610989" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.307106 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.308141 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.310998 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.311094 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hn6tv" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.311343 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.326045 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.329501 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbr2s\" (UniqueName: \"kubernetes.io/projected/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-kube-api-access-xbr2s\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.329550 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config\") pod \"openstackclient\" (UID: 
\"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.329692 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.329779 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.432063 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbr2s\" (UniqueName: \"kubernetes.io/projected/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-kube-api-access-xbr2s\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.432132 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.432203 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 
crc kubenswrapper[4789]: I1216 07:13:13.432259 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.433113 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.442899 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.442929 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.448962 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbr2s\" (UniqueName: \"kubernetes.io/projected/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-kube-api-access-xbr2s\") pod \"openstackclient\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.629269 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 07:13:13 crc kubenswrapper[4789]: I1216 07:13:13.692901 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 07:13:14 crc kubenswrapper[4789]: W1216 07:13:14.114269 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0c7fab5_e2bd_4a68_9a72_3eeecb4ce978.slice/crio-bf118a53934d0e1090a31cd4951f55deb7a17f5f30c148c28a0689467732db0e WatchSource:0}: Error finding container bf118a53934d0e1090a31cd4951f55deb7a17f5f30c148c28a0689467732db0e: Status 404 returned error can't find the container with id bf118a53934d0e1090a31cd4951f55deb7a17f5f30c148c28a0689467732db0e Dec 16 07:13:14 crc kubenswrapper[4789]: I1216 07:13:14.120036 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 07:13:14 crc kubenswrapper[4789]: I1216 07:13:14.335059 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978","Type":"ContainerStarted","Data":"bf118a53934d0e1090a31cd4951f55deb7a17f5f30c148c28a0689467732db0e"} Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.606617 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7f7b9cd85-4tf54"] Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.609863 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.626281 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f7b9cd85-4tf54"] Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.627452 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.628069 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.628724 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.641989 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-run-httpd\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.642046 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-public-tls-certs\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.642117 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-internal-tls-certs\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 
07:13:18.642144 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.642202 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-log-httpd\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.642275 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-config-data\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.642322 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7stw\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-kube-api-access-t7stw\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.642378 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-combined-ca-bundle\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc 
kubenswrapper[4789]: I1216 07:13:18.743516 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-combined-ca-bundle\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.743586 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-run-httpd\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.743613 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-public-tls-certs\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.743652 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-internal-tls-certs\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.743674 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.743717 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-log-httpd\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.743765 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-config-data\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.743795 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7stw\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-kube-api-access-t7stw\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.744986 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-log-httpd\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.745668 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-run-httpd\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.752678 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-combined-ca-bundle\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.753384 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-config-data\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.762998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-public-tls-certs\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.763728 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7stw\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-kube-api-access-t7stw\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.764153 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-internal-tls-certs\") pod \"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.778342 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift\") pod 
\"swift-proxy-7f7b9cd85-4tf54\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.953179 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:18 crc kubenswrapper[4789]: I1216 07:13:18.992830 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 07:13:20 crc kubenswrapper[4789]: I1216 07:13:20.593626 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:20 crc kubenswrapper[4789]: I1216 07:13:20.594477 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-central-agent" containerID="cri-o://cb9d68d7976543c4e65c36ed7dd1ec2471f3528f610140fbe2a0663fb1ea6c9d" gracePeriod=30 Dec 16 07:13:20 crc kubenswrapper[4789]: I1216 07:13:20.594570 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="proxy-httpd" containerID="cri-o://b556dfe0b5cec3e2265012d23751e66482601957b63991c82cf1f1f9f278a236" gracePeriod=30 Dec 16 07:13:20 crc kubenswrapper[4789]: I1216 07:13:20.594606 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="sg-core" containerID="cri-o://61cee93c6ad5fb754fb2d1a17fe445e063006df0345da5c32bc08a1cebe830f7" gracePeriod=30 Dec 16 07:13:20 crc kubenswrapper[4789]: I1216 07:13:20.594996 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-notification-agent" 
containerID="cri-o://9f0d9f99d3b06828a566d4f372b298f3a80e1ef7ec89935e786c589bc7ffc504" gracePeriod=30 Dec 16 07:13:20 crc kubenswrapper[4789]: I1216 07:13:20.601506 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.160:3000/\": EOF" Dec 16 07:13:21 crc kubenswrapper[4789]: I1216 07:13:21.421620 4789 generic.go:334] "Generic (PLEG): container finished" podID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerID="b556dfe0b5cec3e2265012d23751e66482601957b63991c82cf1f1f9f278a236" exitCode=0 Dec 16 07:13:21 crc kubenswrapper[4789]: I1216 07:13:21.421990 4789 generic.go:334] "Generic (PLEG): container finished" podID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerID="61cee93c6ad5fb754fb2d1a17fe445e063006df0345da5c32bc08a1cebe830f7" exitCode=2 Dec 16 07:13:21 crc kubenswrapper[4789]: I1216 07:13:21.422002 4789 generic.go:334] "Generic (PLEG): container finished" podID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerID="cb9d68d7976543c4e65c36ed7dd1ec2471f3528f610140fbe2a0663fb1ea6c9d" exitCode=0 Dec 16 07:13:21 crc kubenswrapper[4789]: I1216 07:13:21.421696 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerDied","Data":"b556dfe0b5cec3e2265012d23751e66482601957b63991c82cf1f1f9f278a236"} Dec 16 07:13:21 crc kubenswrapper[4789]: I1216 07:13:21.422044 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerDied","Data":"61cee93c6ad5fb754fb2d1a17fe445e063006df0345da5c32bc08a1cebe830f7"} Dec 16 07:13:21 crc kubenswrapper[4789]: I1216 07:13:21.422059 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerDied","Data":"cb9d68d7976543c4e65c36ed7dd1ec2471f3528f610140fbe2a0663fb1ea6c9d"} Dec 16 07:13:22 crc kubenswrapper[4789]: I1216 07:13:22.042894 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:13:22 crc kubenswrapper[4789]: I1216 07:13:22.043901 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:13:22 crc kubenswrapper[4789]: I1216 07:13:22.437840 4789 generic.go:334] "Generic (PLEG): container finished" podID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerID="9f0d9f99d3b06828a566d4f372b298f3a80e1ef7ec89935e786c589bc7ffc504" exitCode=0 Dec 16 07:13:22 crc kubenswrapper[4789]: I1216 07:13:22.437896 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerDied","Data":"9f0d9f99d3b06828a566d4f372b298f3a80e1ef7ec89935e786c589bc7ffc504"} Dec 16 07:13:23 crc kubenswrapper[4789]: I1216 07:13:23.308118 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.160:3000/\": dial tcp 10.217.0.160:3000: connect: connection refused" Dec 16 07:13:24 crc kubenswrapper[4789]: I1216 07:13:24.609333 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.164237 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.180461 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-scripts\") pod \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.180552 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-combined-ca-bundle\") pod \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.180698 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l28s4\" (UniqueName: \"kubernetes.io/projected/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-kube-api-access-l28s4\") pod \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.180754 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-sg-core-conf-yaml\") pod \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.180783 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-run-httpd\") pod \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.180815 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-config-data\") pod \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.180881 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-log-httpd\") pod \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\" (UID: \"bf327880-91aa-47d8-8a6c-9b44ec3fd86c\") " Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.181838 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bf327880-91aa-47d8-8a6c-9b44ec3fd86c" (UID: "bf327880-91aa-47d8-8a6c-9b44ec3fd86c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.182404 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bf327880-91aa-47d8-8a6c-9b44ec3fd86c" (UID: "bf327880-91aa-47d8-8a6c-9b44ec3fd86c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.192012 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-scripts" (OuterVolumeSpecName: "scripts") pod "bf327880-91aa-47d8-8a6c-9b44ec3fd86c" (UID: "bf327880-91aa-47d8-8a6c-9b44ec3fd86c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.192641 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-kube-api-access-l28s4" (OuterVolumeSpecName: "kube-api-access-l28s4") pod "bf327880-91aa-47d8-8a6c-9b44ec3fd86c" (UID: "bf327880-91aa-47d8-8a6c-9b44ec3fd86c"). InnerVolumeSpecName "kube-api-access-l28s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.255368 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bf327880-91aa-47d8-8a6c-9b44ec3fd86c" (UID: "bf327880-91aa-47d8-8a6c-9b44ec3fd86c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.282879 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l28s4\" (UniqueName: \"kubernetes.io/projected/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-kube-api-access-l28s4\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.282924 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.282938 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.282949 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-log-httpd\") on 
node \"crc\" DevicePath \"\"" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.282960 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.314173 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf327880-91aa-47d8-8a6c-9b44ec3fd86c" (UID: "bf327880-91aa-47d8-8a6c-9b44ec3fd86c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.337781 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f7b9cd85-4tf54"] Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.344014 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-config-data" (OuterVolumeSpecName: "config-data") pod "bf327880-91aa-47d8-8a6c-9b44ec3fd86c" (UID: "bf327880-91aa-47d8-8a6c-9b44ec3fd86c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.384966 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.384996 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf327880-91aa-47d8-8a6c-9b44ec3fd86c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.467998 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7b9cd85-4tf54" event={"ID":"fc02bf7e-2d67-40a4-94b0-5807631a5b2e","Type":"ContainerStarted","Data":"71486eafa8ee116f27520b440d4c2fe007efa0cf3a0d2c0fb82ad3f2d77810e5"} Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.470803 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf327880-91aa-47d8-8a6c-9b44ec3fd86c","Type":"ContainerDied","Data":"ace60c61865e2761c20361df902fa2b13d70f3547a00ffe848db1ad429a15036"} Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.470981 4789 scope.go:117] "RemoveContainer" containerID="b556dfe0b5cec3e2265012d23751e66482601957b63991c82cf1f1f9f278a236" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.471257 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.531425 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.548519 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.557704 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:25 crc kubenswrapper[4789]: E1216 07:13:25.558158 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-central-agent" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558181 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-central-agent" Dec 16 07:13:25 crc kubenswrapper[4789]: E1216 07:13:25.558196 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-notification-agent" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558204 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-notification-agent" Dec 16 07:13:25 crc kubenswrapper[4789]: E1216 07:13:25.558212 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="sg-core" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558217 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="sg-core" Dec 16 07:13:25 crc kubenswrapper[4789]: E1216 07:13:25.558239 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="proxy-httpd" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558245 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="proxy-httpd" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558435 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="proxy-httpd" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558454 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-notification-agent" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558479 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="sg-core" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.558493 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" containerName="ceilometer-central-agent" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.560106 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.567695 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.571933 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.574667 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.691929 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-scripts\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.692015 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpn6p\" (UniqueName: \"kubernetes.io/projected/2ff52d63-479a-4927-994c-53b6d9091ba6-kube-api-access-kpn6p\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.692071 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-log-httpd\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.692110 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " 
pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.692266 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.692306 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-config-data\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.692355 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-run-httpd\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.696395 4789 scope.go:117] "RemoveContainer" containerID="61cee93c6ad5fb754fb2d1a17fe445e063006df0345da5c32bc08a1cebe830f7" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.726499 4789 scope.go:117] "RemoveContainer" containerID="9f0d9f99d3b06828a566d4f372b298f3a80e1ef7ec89935e786c589bc7ffc504" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.794253 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-scripts\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.794331 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kpn6p\" (UniqueName: \"kubernetes.io/projected/2ff52d63-479a-4927-994c-53b6d9091ba6-kube-api-access-kpn6p\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.794378 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-log-httpd\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.794412 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.794464 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.794485 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-config-data\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.794513 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-run-httpd\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 
07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.795275 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-log-httpd\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.795599 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-run-httpd\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.798539 4789 scope.go:117] "RemoveContainer" containerID="cb9d68d7976543c4e65c36ed7dd1ec2471f3528f610140fbe2a0663fb1ea6c9d" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.798819 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-config-data\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.798966 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.799002 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.800178 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-scripts\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.818611 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpn6p\" (UniqueName: \"kubernetes.io/projected/2ff52d63-479a-4927-994c-53b6d9091ba6-kube-api-access-kpn6p\") pod \"ceilometer-0\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " pod="openstack/ceilometer-0" Dec 16 07:13:25 crc kubenswrapper[4789]: I1216 07:13:25.882032 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.126821 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf327880-91aa-47d8-8a6c-9b44ec3fd86c" path="/var/lib/kubelet/pods/bf327880-91aa-47d8-8a6c-9b44ec3fd86c/volumes" Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.137358 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.355318 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.485548 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerStarted","Data":"91b78f70f3a005672923a7d83395fcc6cb9cb2e60a380ca40bc6bd32ebd7fff5"} Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.488961 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7b9cd85-4tf54" event={"ID":"fc02bf7e-2d67-40a4-94b0-5807631a5b2e","Type":"ContainerStarted","Data":"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac"} Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.488994 4789 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7b9cd85-4tf54" event={"ID":"fc02bf7e-2d67-40a4-94b0-5807631a5b2e","Type":"ContainerStarted","Data":"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0"} Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.490023 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.490051 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.491615 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978","Type":"ContainerStarted","Data":"f8d1a6dd5068981a14ce87c5e3a5413bba046f93f1f41541bf47903d89cd9291"} Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.510437 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7f7b9cd85-4tf54" podStartSLOduration=8.510417935 podStartE2EDuration="8.510417935s" podCreationTimestamp="2025-12-16 07:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:26.508317524 +0000 UTC m=+1344.770205173" watchObservedRunningTime="2025-12-16 07:13:26.510417935 +0000 UTC m=+1344.772305564" Dec 16 07:13:26 crc kubenswrapper[4789]: I1216 07:13:26.531476 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.726056572 podStartE2EDuration="13.531453966s" podCreationTimestamp="2025-12-16 07:13:13 +0000 UTC" firstStartedPulling="2025-12-16 07:13:14.116306711 +0000 UTC m=+1332.378194340" lastFinishedPulling="2025-12-16 07:13:24.921704105 +0000 UTC m=+1343.183591734" observedRunningTime="2025-12-16 07:13:26.528026743 +0000 UTC m=+1344.789914402" 
watchObservedRunningTime="2025-12-16 07:13:26.531453966 +0000 UTC m=+1344.793341595" Dec 16 07:13:27 crc kubenswrapper[4789]: I1216 07:13:27.501680 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerStarted","Data":"d40d3728be48806826df54a372704b786bac0f14ad9fe3bda9e372d1cf5e0062"} Dec 16 07:13:28 crc kubenswrapper[4789]: I1216 07:13:28.512444 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerStarted","Data":"067e0b15e2f727c7cf69899adca7bacb7a433fe3ad0da81c468ca79b389b8ed1"} Dec 16 07:13:29 crc kubenswrapper[4789]: I1216 07:13:29.523850 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerStarted","Data":"58f1dc035932bb08dc244f1eb46f56bdb726c4402485fa0d897c3bf64321356b"} Dec 16 07:13:29 crc kubenswrapper[4789]: I1216 07:13:29.554605 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:13:29 crc kubenswrapper[4789]: I1216 07:13:29.613072 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-584d7ccd9b-5jc2j"] Dec 16 07:13:29 crc kubenswrapper[4789]: I1216 07:13:29.613344 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-584d7ccd9b-5jc2j" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-api" containerID="cri-o://23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828" gracePeriod=30 Dec 16 07:13:29 crc kubenswrapper[4789]: I1216 07:13:29.613449 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-584d7ccd9b-5jc2j" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-httpd" 
containerID="cri-o://3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe" gracePeriod=30 Dec 16 07:13:31 crc kubenswrapper[4789]: I1216 07:13:31.541843 4789 generic.go:334] "Generic (PLEG): container finished" podID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerID="3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe" exitCode=0 Dec 16 07:13:31 crc kubenswrapper[4789]: I1216 07:13:31.541947 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584d7ccd9b-5jc2j" event={"ID":"9be1b65d-5e37-46d5-b7b8-ffd770eac023","Type":"ContainerDied","Data":"3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe"} Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.369149 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.464728 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-ovndb-tls-certs\") pod \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.466181 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-config\") pod \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.466413 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-combined-ca-bundle\") pod \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.466795 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-httpd-config\") pod \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.467118 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfchb\" (UniqueName: \"kubernetes.io/projected/9be1b65d-5e37-46d5-b7b8-ffd770eac023-kube-api-access-rfchb\") pod \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\" (UID: \"9be1b65d-5e37-46d5-b7b8-ffd770eac023\") " Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.473884 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be1b65d-5e37-46d5-b7b8-ffd770eac023-kube-api-access-rfchb" (OuterVolumeSpecName: "kube-api-access-rfchb") pod "9be1b65d-5e37-46d5-b7b8-ffd770eac023" (UID: "9be1b65d-5e37-46d5-b7b8-ffd770eac023"). InnerVolumeSpecName "kube-api-access-rfchb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.475642 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9be1b65d-5e37-46d5-b7b8-ffd770eac023" (UID: "9be1b65d-5e37-46d5-b7b8-ffd770eac023"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.516945 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-config" (OuterVolumeSpecName: "config") pod "9be1b65d-5e37-46d5-b7b8-ffd770eac023" (UID: "9be1b65d-5e37-46d5-b7b8-ffd770eac023"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.518564 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9be1b65d-5e37-46d5-b7b8-ffd770eac023" (UID: "9be1b65d-5e37-46d5-b7b8-ffd770eac023"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.553849 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9be1b65d-5e37-46d5-b7b8-ffd770eac023" (UID: "9be1b65d-5e37-46d5-b7b8-ffd770eac023"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.566857 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerStarted","Data":"30a4c952a805d31fc9d0d440137684c9fada6b7fc655b2769cd78b34e6579638"} Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.567153 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-central-agent" containerID="cri-o://d40d3728be48806826df54a372704b786bac0f14ad9fe3bda9e372d1cf5e0062" gracePeriod=30 Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.567434 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.567719 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="proxy-httpd" 
containerID="cri-o://30a4c952a805d31fc9d0d440137684c9fada6b7fc655b2769cd78b34e6579638" gracePeriod=30 Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.567772 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="sg-core" containerID="cri-o://58f1dc035932bb08dc244f1eb46f56bdb726c4402485fa0d897c3bf64321356b" gracePeriod=30 Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.567810 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-notification-agent" containerID="cri-o://067e0b15e2f727c7cf69899adca7bacb7a433fe3ad0da81c468ca79b389b8ed1" gracePeriod=30 Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.570241 4789 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.570263 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.570272 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.570280 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9be1b65d-5e37-46d5-b7b8-ffd770eac023-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.570289 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfchb\" 
(UniqueName: \"kubernetes.io/projected/9be1b65d-5e37-46d5-b7b8-ffd770eac023-kube-api-access-rfchb\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.574570 4789 generic.go:334] "Generic (PLEG): container finished" podID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerID="23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828" exitCode=0 Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.574617 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584d7ccd9b-5jc2j" event={"ID":"9be1b65d-5e37-46d5-b7b8-ffd770eac023","Type":"ContainerDied","Data":"23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828"} Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.574645 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584d7ccd9b-5jc2j" event={"ID":"9be1b65d-5e37-46d5-b7b8-ffd770eac023","Type":"ContainerDied","Data":"e6eda3cb332bfaf175685b026abb30c0bf34283ab73588e6da327cd9d7fe1e43"} Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.574663 4789 scope.go:117] "RemoveContainer" containerID="3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.574803 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-584d7ccd9b-5jc2j" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.608109 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.145462291 podStartE2EDuration="8.608072208s" podCreationTimestamp="2025-12-16 07:13:25 +0000 UTC" firstStartedPulling="2025-12-16 07:13:26.35499688 +0000 UTC m=+1344.616884509" lastFinishedPulling="2025-12-16 07:13:32.817606797 +0000 UTC m=+1351.079494426" observedRunningTime="2025-12-16 07:13:33.596723143 +0000 UTC m=+1351.858610772" watchObservedRunningTime="2025-12-16 07:13:33.608072208 +0000 UTC m=+1351.869959837" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.614323 4789 scope.go:117] "RemoveContainer" containerID="23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.631836 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-584d7ccd9b-5jc2j"] Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.636724 4789 scope.go:117] "RemoveContainer" containerID="3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe" Dec 16 07:13:33 crc kubenswrapper[4789]: E1216 07:13:33.639067 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe\": container with ID starting with 3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe not found: ID does not exist" containerID="3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.639109 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe"} err="failed to get container status \"3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe\": rpc 
error: code = NotFound desc = could not find container \"3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe\": container with ID starting with 3a1add1108188d49cab67c68fa5e8e8379eac04592d18330ff96da459490d1fe not found: ID does not exist" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.639132 4789 scope.go:117] "RemoveContainer" containerID="23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828" Dec 16 07:13:33 crc kubenswrapper[4789]: E1216 07:13:33.639545 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828\": container with ID starting with 23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828 not found: ID does not exist" containerID="23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.639575 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828"} err="failed to get container status \"23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828\": rpc error: code = NotFound desc = could not find container \"23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828\": container with ID starting with 23e577e1c8c8df336dca4ea5bc52736db673ada38f3c38a19b4262428b034828 not found: ID does not exist" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.639585 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-584d7ccd9b-5jc2j"] Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.963303 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:33 crc kubenswrapper[4789]: I1216 07:13:33.964057 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.118571 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" path="/var/lib/kubelet/pods/9be1b65d-5e37-46d5-b7b8-ffd770eac023/volumes" Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.123663 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.124013 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-log" containerID="cri-o://3499b6119f9e2351dc1c72dc30e3b362986e2dc30e9b09646c3b984e0c3ed2a8" gracePeriod=30 Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.124217 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-httpd" containerID="cri-o://0df976f34b34ac9975bb7a34f8e6afbcddb2a22bcfd17a9c19f5b63f0b00cbda" gracePeriod=30 Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.585463 4789 generic.go:334] "Generic (PLEG): container finished" podID="bb49c501-7035-44f5-8db1-b88552df2500" containerID="3499b6119f9e2351dc1c72dc30e3b362986e2dc30e9b09646c3b984e0c3ed2a8" exitCode=143 Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.585540 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb49c501-7035-44f5-8db1-b88552df2500","Type":"ContainerDied","Data":"3499b6119f9e2351dc1c72dc30e3b362986e2dc30e9b09646c3b984e0c3ed2a8"} Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.590449 4789 generic.go:334] "Generic (PLEG): container finished" podID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerID="30a4c952a805d31fc9d0d440137684c9fada6b7fc655b2769cd78b34e6579638" 
exitCode=0 Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.590477 4789 generic.go:334] "Generic (PLEG): container finished" podID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerID="58f1dc035932bb08dc244f1eb46f56bdb726c4402485fa0d897c3bf64321356b" exitCode=2 Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.590485 4789 generic.go:334] "Generic (PLEG): container finished" podID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerID="067e0b15e2f727c7cf69899adca7bacb7a433fe3ad0da81c468ca79b389b8ed1" exitCode=0 Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.590539 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerDied","Data":"30a4c952a805d31fc9d0d440137684c9fada6b7fc655b2769cd78b34e6579638"} Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.590589 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerDied","Data":"58f1dc035932bb08dc244f1eb46f56bdb726c4402485fa0d897c3bf64321356b"} Dec 16 07:13:34 crc kubenswrapper[4789]: I1216 07:13:34.590601 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerDied","Data":"067e0b15e2f727c7cf69899adca7bacb7a433fe3ad0da81c468ca79b389b8ed1"} Dec 16 07:13:35 crc kubenswrapper[4789]: I1216 07:13:35.064809 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:13:35 crc kubenswrapper[4789]: I1216 07:13:35.065380 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-log" containerID="cri-o://61ee4f5f23aa653cfe8438c1bb953359d5aad0e0cc33f763adc23e8bb55b4325" gracePeriod=30 Dec 16 07:13:35 crc kubenswrapper[4789]: I1216 07:13:35.065464 
4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-httpd" containerID="cri-o://6b837c35d386080c7160b250bba62c98d1d6538584b87e84bdf27aa41539aeb3" gracePeriod=30 Dec 16 07:13:35 crc kubenswrapper[4789]: I1216 07:13:35.599238 4789 generic.go:334] "Generic (PLEG): container finished" podID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerID="61ee4f5f23aa653cfe8438c1bb953359d5aad0e0cc33f763adc23e8bb55b4325" exitCode=143 Dec 16 07:13:35 crc kubenswrapper[4789]: I1216 07:13:35.599278 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0633f9f-8819-4d01-8925-3c09e214c5f3","Type":"ContainerDied","Data":"61ee4f5f23aa653cfe8438c1bb953359d5aad0e0cc33f763adc23e8bb55b4325"} Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.620312 4789 generic.go:334] "Generic (PLEG): container finished" podID="bb49c501-7035-44f5-8db1-b88552df2500" containerID="0df976f34b34ac9975bb7a34f8e6afbcddb2a22bcfd17a9c19f5b63f0b00cbda" exitCode=0 Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.620383 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb49c501-7035-44f5-8db1-b88552df2500","Type":"ContainerDied","Data":"0df976f34b34ac9975bb7a34f8e6afbcddb2a22bcfd17a9c19f5b63f0b00cbda"} Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.877172 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.952003 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-httpd-run\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.952045 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-combined-ca-bundle\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.952145 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-scripts\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.952219 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-public-tls-certs\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.952246 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb64k\" (UniqueName: \"kubernetes.io/projected/bb49c501-7035-44f5-8db1-b88552df2500-kube-api-access-bb64k\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.952603 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.953112 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.953165 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-config-data\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.953197 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-logs\") pod \"bb49c501-7035-44f5-8db1-b88552df2500\" (UID: \"bb49c501-7035-44f5-8db1-b88552df2500\") " Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.953827 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.954480 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-logs" (OuterVolumeSpecName: "logs") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.959585 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-scripts" (OuterVolumeSpecName: "scripts") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.960615 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb49c501-7035-44f5-8db1-b88552df2500-kube-api-access-bb64k" (OuterVolumeSpecName: "kube-api-access-bb64k") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "kube-api-access-bb64k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.981233 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:13:37 crc kubenswrapper[4789]: I1216 07:13:37.988784 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.011272 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.022260 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-config-data" (OuterVolumeSpecName: "config-data") pod "bb49c501-7035-44f5-8db1-b88552df2500" (UID: "bb49c501-7035-44f5-8db1-b88552df2500"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.057231 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.057290 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.057306 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb64k\" (UniqueName: \"kubernetes.io/projected/bb49c501-7035-44f5-8db1-b88552df2500-kube-api-access-bb64k\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.057353 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 16 07:13:38 crc 
kubenswrapper[4789]: I1216 07:13:38.057370 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.057386 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb49c501-7035-44f5-8db1-b88552df2500-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.057399 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb49c501-7035-44f5-8db1-b88552df2500-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.083042 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.160279 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.632893 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb49c501-7035-44f5-8db1-b88552df2500","Type":"ContainerDied","Data":"da9c800d0fad1846bc7f03eb8cb6b0f6d90f7f01539b349d36fff38674a8a963"} Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.632968 4789 scope.go:117] "RemoveContainer" containerID="0df976f34b34ac9975bb7a34f8e6afbcddb2a22bcfd17a9c19f5b63f0b00cbda" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.633019 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.636373 4789 generic.go:334] "Generic (PLEG): container finished" podID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerID="6b837c35d386080c7160b250bba62c98d1d6538584b87e84bdf27aa41539aeb3" exitCode=0 Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.636420 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0633f9f-8819-4d01-8925-3c09e214c5f3","Type":"ContainerDied","Data":"6b837c35d386080c7160b250bba62c98d1d6538584b87e84bdf27aa41539aeb3"} Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.665593 4789 scope.go:117] "RemoveContainer" containerID="3499b6119f9e2351dc1c72dc30e3b362986e2dc30e9b09646c3b984e0c3ed2a8" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.682110 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.695322 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731061 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:13:38 crc kubenswrapper[4789]: E1216 07:13:38.731516 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-log" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731536 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-log" Dec 16 07:13:38 crc kubenswrapper[4789]: E1216 07:13:38.731552 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-httpd" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731560 4789 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-httpd" Dec 16 07:13:38 crc kubenswrapper[4789]: E1216 07:13:38.731577 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-api" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731585 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-api" Dec 16 07:13:38 crc kubenswrapper[4789]: E1216 07:13:38.731595 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-httpd" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731602 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-httpd" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731829 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-api" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731862 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be1b65d-5e37-46d5-b7b8-ffd770eac023" containerName="neutron-httpd" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731875 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-httpd" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.731887 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb49c501-7035-44f5-8db1-b88552df2500" containerName="glance-log" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.732998 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.735806 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.738364 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.754802 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.874490 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs8vg\" (UniqueName: \"kubernetes.io/projected/a6423ab7-79a3-402c-9115-e54b5f29ad05-kube-api-access-rs8vg\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.874608 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.874662 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.874784 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.874833 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.874955 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.875102 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.875314 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-logs\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977414 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977481 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-logs\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977514 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs8vg\" (UniqueName: \"kubernetes.io/projected/a6423ab7-79a3-402c-9115-e54b5f29ad05-kube-api-access-rs8vg\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977549 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977566 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977605 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977622 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977687 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.977759 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.978116 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-logs\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.978173 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.982833 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.986046 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.991725 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:38 crc kubenswrapper[4789]: I1216 07:13:38.999514 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.002571 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs8vg\" (UniqueName: \"kubernetes.io/projected/a6423ab7-79a3-402c-9115-e54b5f29ad05-kube-api-access-rs8vg\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " 
pod="openstack/glance-default-external-api-0" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.012233 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " pod="openstack/glance-default-external-api-0" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.057552 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.566606 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.654101 4789 generic.go:334] "Generic (PLEG): container finished" podID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerID="d40d3728be48806826df54a372704b786bac0f14ad9fe3bda9e372d1cf5e0062" exitCode=0 Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.654187 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerDied","Data":"d40d3728be48806826df54a372704b786bac0f14ad9fe3bda9e372d1cf5e0062"} Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.671175 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b0633f9f-8819-4d01-8925-3c09e214c5f3","Type":"ContainerDied","Data":"1a1fe6c8c3f99ebed9d8e0ccae9aedd4686e5dc9ee14d536a2efb37443998eb7"} Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.671228 4789 scope.go:117] "RemoveContainer" containerID="6b837c35d386080c7160b250bba62c98d1d6538584b87e84bdf27aa41539aeb3" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.671348 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.695656 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-config-data\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.695776 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-combined-ca-bundle\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.695811 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-logs\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.695967 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-httpd-run\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.696028 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.696076 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-scripts\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.696102 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-internal-tls-certs\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.696141 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52z2f\" (UniqueName: \"kubernetes.io/projected/b0633f9f-8819-4d01-8925-3c09e214c5f3-kube-api-access-52z2f\") pod \"b0633f9f-8819-4d01-8925-3c09e214c5f3\" (UID: \"b0633f9f-8819-4d01-8925-3c09e214c5f3\") " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.697881 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.704356 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-logs" (OuterVolumeSpecName: "logs") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.707572 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0633f9f-8819-4d01-8925-3c09e214c5f3-kube-api-access-52z2f" (OuterVolumeSpecName: "kube-api-access-52z2f") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "kube-api-access-52z2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.712191 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.725310 4789 scope.go:117] "RemoveContainer" containerID="61ee4f5f23aa653cfe8438c1bb953359d5aad0e0cc33f763adc23e8bb55b4325" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.725616 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-scripts" (OuterVolumeSpecName: "scripts") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.781722 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.782504 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-config-data" (OuterVolumeSpecName: "config-data") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.797889 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.797943 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.797953 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.797962 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52z2f\" (UniqueName: \"kubernetes.io/projected/b0633f9f-8819-4d01-8925-3c09e214c5f3-kube-api-access-52z2f\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.797974 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.797981 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.797990 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0633f9f-8819-4d01-8925-3c09e214c5f3-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.799048 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.835784 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0633f9f-8819-4d01-8925-3c09e214c5f3" (UID: "b0633f9f-8819-4d01-8925-3c09e214c5f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.837904 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.860068 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.900361 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:39 crc kubenswrapper[4789]: I1216 07:13:39.900394 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0633f9f-8819-4d01-8925-3c09e214c5f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.001404 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-run-httpd\") pod \"2ff52d63-479a-4927-994c-53b6d9091ba6\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.001483 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-log-httpd\") pod \"2ff52d63-479a-4927-994c-53b6d9091ba6\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.001556 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpn6p\" (UniqueName: \"kubernetes.io/projected/2ff52d63-479a-4927-994c-53b6d9091ba6-kube-api-access-kpn6p\") pod \"2ff52d63-479a-4927-994c-53b6d9091ba6\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.001656 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-sg-core-conf-yaml\") pod \"2ff52d63-479a-4927-994c-53b6d9091ba6\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " Dec 16 07:13:40 
crc kubenswrapper[4789]: I1216 07:13:40.001733 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-config-data\") pod \"2ff52d63-479a-4927-994c-53b6d9091ba6\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.001764 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-scripts\") pod \"2ff52d63-479a-4927-994c-53b6d9091ba6\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.001803 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-combined-ca-bundle\") pod \"2ff52d63-479a-4927-994c-53b6d9091ba6\" (UID: \"2ff52d63-479a-4927-994c-53b6d9091ba6\") " Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.004187 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ff52d63-479a-4927-994c-53b6d9091ba6" (UID: "2ff52d63-479a-4927-994c-53b6d9091ba6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.004444 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.005205 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ff52d63-479a-4927-994c-53b6d9091ba6" (UID: "2ff52d63-479a-4927-994c-53b6d9091ba6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.010726 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-scripts" (OuterVolumeSpecName: "scripts") pod "2ff52d63-479a-4927-994c-53b6d9091ba6" (UID: "2ff52d63-479a-4927-994c-53b6d9091ba6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.011138 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff52d63-479a-4927-994c-53b6d9091ba6-kube-api-access-kpn6p" (OuterVolumeSpecName: "kube-api-access-kpn6p") pod "2ff52d63-479a-4927-994c-53b6d9091ba6" (UID: "2ff52d63-479a-4927-994c-53b6d9091ba6"). InnerVolumeSpecName "kube-api-access-kpn6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.015326 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034250 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: E1216 07:13:40.034631 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="proxy-httpd" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034648 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="proxy-httpd" Dec 16 07:13:40 crc kubenswrapper[4789]: E1216 07:13:40.034680 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="sg-core" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034688 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="sg-core" Dec 16 07:13:40 crc kubenswrapper[4789]: E1216 07:13:40.034699 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-httpd" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034708 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-httpd" Dec 16 07:13:40 crc kubenswrapper[4789]: E1216 07:13:40.034728 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-log" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034736 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-log" Dec 16 07:13:40 crc kubenswrapper[4789]: E1216 07:13:40.034748 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-notification-agent" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034756 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-notification-agent" Dec 16 07:13:40 crc kubenswrapper[4789]: E1216 07:13:40.034772 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-central-agent" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034781 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-central-agent" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034958 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="proxy-httpd" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034974 4789 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-notification-agent" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034987 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-httpd" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.034997 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" containerName="glance-log" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.035005 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="ceilometer-central-agent" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.035012 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" containerName="sg-core" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.035852 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.047558 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.062528 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.072951 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ff52d63-479a-4927-994c-53b6d9091ba6" (UID: "2ff52d63-479a-4927-994c-53b6d9091ba6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.073503 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103343 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103409 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103447 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103471 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103495 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103517 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103544 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9hf\" (UniqueName: \"kubernetes.io/projected/37216df1-3f61-412b-bffb-5e36812383f4-kube-api-access-cl9hf\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103585 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103635 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103650 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc 
kubenswrapper[4789]: I1216 07:13:40.103659 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103668 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff52d63-479a-4927-994c-53b6d9091ba6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.103676 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpn6p\" (UniqueName: \"kubernetes.io/projected/2ff52d63-479a-4927-994c-53b6d9091ba6-kube-api-access-kpn6p\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.118797 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0633f9f-8819-4d01-8925-3c09e214c5f3" path="/var/lib/kubelet/pods/b0633f9f-8819-4d01-8925-3c09e214c5f3/volumes" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.119502 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb49c501-7035-44f5-8db1-b88552df2500" path="/var/lib/kubelet/pods/bb49c501-7035-44f5-8db1-b88552df2500/volumes" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.121292 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff52d63-479a-4927-994c-53b6d9091ba6" (UID: "2ff52d63-479a-4927-994c-53b6d9091ba6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.172757 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-config-data" (OuterVolumeSpecName: "config-data") pod "2ff52d63-479a-4927-994c-53b6d9091ba6" (UID: "2ff52d63-479a-4927-994c-53b6d9091ba6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.210786 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.210870 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.210904 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.210960 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9hf\" (UniqueName: \"kubernetes.io/projected/37216df1-3f61-412b-bffb-5e36812383f4-kube-api-access-cl9hf\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.211048 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.211151 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.211223 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.211279 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.211343 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.211359 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2ff52d63-479a-4927-994c-53b6d9091ba6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.212242 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.215284 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.216751 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.220821 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.221691 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.221860 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.223196 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.243659 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9hf\" (UniqueName: \"kubernetes.io/projected/37216df1-3f61-412b-bffb-5e36812383f4-kube-api-access-cl9hf\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.270371 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.376744 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.696131 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6423ab7-79a3-402c-9115-e54b5f29ad05","Type":"ContainerStarted","Data":"e13e48820a043301c899b786263392f63ecec23d16cafd76439ce501fb5d2638"} Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.696538 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6423ab7-79a3-402c-9115-e54b5f29ad05","Type":"ContainerStarted","Data":"f88bdaeef19a48892d151efa3947bbd68842a623b72aee8fcce3361a02b0092e"} Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.701211 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ff52d63-479a-4927-994c-53b6d9091ba6","Type":"ContainerDied","Data":"91b78f70f3a005672923a7d83395fcc6cb9cb2e60a380ca40bc6bd32ebd7fff5"} Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.701348 4789 scope.go:117] "RemoveContainer" containerID="30a4c952a805d31fc9d0d440137684c9fada6b7fc655b2769cd78b34e6579638" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.701521 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.731642 4789 scope.go:117] "RemoveContainer" containerID="58f1dc035932bb08dc244f1eb46f56bdb726c4402485fa0d897c3bf64321356b" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.767077 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.768159 4789 scope.go:117] "RemoveContainer" containerID="067e0b15e2f727c7cf69899adca7bacb7a433fe3ad0da81c468ca79b389b8ed1" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.784472 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.797798 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.800332 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.802492 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.810439 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.816773 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.828467 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-scripts\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.828533 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-run-httpd\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.828595 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-config-data\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.828681 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcxnl\" (UniqueName: \"kubernetes.io/projected/10d05e27-889a-4225-b194-771ccd67b38c-kube-api-access-rcxnl\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.828708 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.828743 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-log-httpd\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.828766 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.831996 4789 scope.go:117] "RemoveContainer" containerID="d40d3728be48806826df54a372704b786bac0f14ad9fe3bda9e372d1cf5e0062" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.930535 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-run-httpd\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.930590 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-config-data\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.930632 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcxnl\" (UniqueName: \"kubernetes.io/projected/10d05e27-889a-4225-b194-771ccd67b38c-kube-api-access-rcxnl\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.930656 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.930692 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-log-httpd\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.930713 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.930756 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-scripts\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.932584 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-log-httpd\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.932668 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-run-httpd\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.935938 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.937118 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-scripts\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.937976 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.938297 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-config-data\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.947334 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcxnl\" (UniqueName: \"kubernetes.io/projected/10d05e27-889a-4225-b194-771ccd67b38c-kube-api-access-rcxnl\") pod \"ceilometer-0\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " pod="openstack/ceilometer-0" Dec 16 07:13:40 crc kubenswrapper[4789]: I1216 07:13:40.997896 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:13:41 crc kubenswrapper[4789]: W1216 07:13:41.008391 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37216df1_3f61_412b_bffb_5e36812383f4.slice/crio-b9db1e248cd008608e0c0683297e0cb2bb75ce3f9381a556ee2dccfb2dfb1a3c WatchSource:0}: Error finding container b9db1e248cd008608e0c0683297e0cb2bb75ce3f9381a556ee2dccfb2dfb1a3c: Status 404 returned error can't find the container with id 
b9db1e248cd008608e0c0683297e0cb2bb75ce3f9381a556ee2dccfb2dfb1a3c Dec 16 07:13:41 crc kubenswrapper[4789]: I1216 07:13:41.127601 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:41 crc kubenswrapper[4789]: I1216 07:13:41.495212 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:41 crc kubenswrapper[4789]: I1216 07:13:41.651192 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:41 crc kubenswrapper[4789]: W1216 07:13:41.653335 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d05e27_889a_4225_b194_771ccd67b38c.slice/crio-862a597d36230daa058271baa6aa6fed90b6f780600ee2c00d3ede1a36f28a6e WatchSource:0}: Error finding container 862a597d36230daa058271baa6aa6fed90b6f780600ee2c00d3ede1a36f28a6e: Status 404 returned error can't find the container with id 862a597d36230daa058271baa6aa6fed90b6f780600ee2c00d3ede1a36f28a6e Dec 16 07:13:41 crc kubenswrapper[4789]: I1216 07:13:41.711189 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37216df1-3f61-412b-bffb-5e36812383f4","Type":"ContainerStarted","Data":"b9db1e248cd008608e0c0683297e0cb2bb75ce3f9381a556ee2dccfb2dfb1a3c"} Dec 16 07:13:41 crc kubenswrapper[4789]: I1216 07:13:41.713808 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerStarted","Data":"862a597d36230daa058271baa6aa6fed90b6f780600ee2c00d3ede1a36f28a6e"} Dec 16 07:13:41 crc kubenswrapper[4789]: I1216 07:13:41.715358 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a6423ab7-79a3-402c-9115-e54b5f29ad05","Type":"ContainerStarted","Data":"79b3fcff6d02b1d1105cdaa7d49563ca416afc3d0d209ae94eae9fd336eca759"} Dec 16 07:13:41 crc kubenswrapper[4789]: I1216 07:13:41.738006 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.7379893749999997 podStartE2EDuration="3.737989375s" podCreationTimestamp="2025-12-16 07:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:41.735925225 +0000 UTC m=+1359.997812874" watchObservedRunningTime="2025-12-16 07:13:41.737989375 +0000 UTC m=+1359.999877004" Dec 16 07:13:42 crc kubenswrapper[4789]: I1216 07:13:42.120147 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff52d63-479a-4927-994c-53b6d9091ba6" path="/var/lib/kubelet/pods/2ff52d63-479a-4927-994c-53b6d9091ba6/volumes" Dec 16 07:13:42 crc kubenswrapper[4789]: I1216 07:13:42.728575 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37216df1-3f61-412b-bffb-5e36812383f4","Type":"ContainerStarted","Data":"82b67d2f0b7d827d390a1737f28832c66819bfb58a92aae85e467319754e80a4"} Dec 16 07:13:42 crc kubenswrapper[4789]: I1216 07:13:42.728961 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37216df1-3f61-412b-bffb-5e36812383f4","Type":"ContainerStarted","Data":"1ca313e4e286bdf363a60db146512417c224d3addedf2f21605dd96befee2ec7"} Dec 16 07:13:42 crc kubenswrapper[4789]: I1216 07:13:42.755493 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.75547399 podStartE2EDuration="2.75547399s" podCreationTimestamp="2025-12-16 07:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:42.750067218 +0000 UTC m=+1361.011954867" watchObservedRunningTime="2025-12-16 07:13:42.75547399 +0000 UTC m=+1361.017361619" Dec 16 07:13:43 crc kubenswrapper[4789]: I1216 07:13:43.737734 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerStarted","Data":"b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6"} Dec 16 07:13:44 crc kubenswrapper[4789]: I1216 07:13:44.751213 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerStarted","Data":"759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada"} Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.226326 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-77l4n"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.227679 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.237302 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-77l4n"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.334818 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-76kjb"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.336487 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.342861 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-76kjb"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.369070 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f20b8e31-ab07-411d-844b-f69077acbe95-operator-scripts\") pod \"nova-api-db-create-77l4n\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.369154 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nc4p\" (UniqueName: \"kubernetes.io/projected/f20b8e31-ab07-411d-844b-f69077acbe95-kube-api-access-5nc4p\") pod \"nova-api-db-create-77l4n\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.434019 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fc07-account-create-update-zxl2v"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.435377 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.440164 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.458929 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fc07-account-create-update-zxl2v"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.470980 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nc4p\" (UniqueName: \"kubernetes.io/projected/f20b8e31-ab07-411d-844b-f69077acbe95-kube-api-access-5nc4p\") pod \"nova-api-db-create-77l4n\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.471044 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-operator-scripts\") pod \"nova-cell0-db-create-76kjb\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.471189 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64hrt\" (UniqueName: \"kubernetes.io/projected/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-kube-api-access-64hrt\") pod \"nova-cell0-db-create-76kjb\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.471237 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f20b8e31-ab07-411d-844b-f69077acbe95-operator-scripts\") pod \"nova-api-db-create-77l4n\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " 
pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.471987 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f20b8e31-ab07-411d-844b-f69077acbe95-operator-scripts\") pod \"nova-api-db-create-77l4n\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.491507 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nc4p\" (UniqueName: \"kubernetes.io/projected/f20b8e31-ab07-411d-844b-f69077acbe95-kube-api-access-5nc4p\") pod \"nova-api-db-create-77l4n\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.536288 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jfvcn"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.537531 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.555957 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jfvcn"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.572891 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64hrt\" (UniqueName: \"kubernetes.io/projected/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-kube-api-access-64hrt\") pod \"nova-cell0-db-create-76kjb\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.573034 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7f1d157-db58-4392-911c-344fcc5a8ce1-operator-scripts\") pod \"nova-api-fc07-account-create-update-zxl2v\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.573068 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-operator-scripts\") pod \"nova-cell0-db-create-76kjb\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.573098 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sljj\" (UniqueName: \"kubernetes.io/projected/a7f1d157-db58-4392-911c-344fcc5a8ce1-kube-api-access-2sljj\") pod \"nova-api-fc07-account-create-update-zxl2v\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.573821 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-operator-scripts\") pod \"nova-cell0-db-create-76kjb\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.592821 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64hrt\" (UniqueName: \"kubernetes.io/projected/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-kube-api-access-64hrt\") pod \"nova-cell0-db-create-76kjb\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.603399 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.649969 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4bcc-account-create-update-xk8nk"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.651106 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.652229 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.655491 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.670339 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4bcc-account-create-update-xk8nk"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.680354 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7f1d157-db58-4392-911c-344fcc5a8ce1-operator-scripts\") pod \"nova-api-fc07-account-create-update-zxl2v\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.680435 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sljj\" (UniqueName: \"kubernetes.io/projected/a7f1d157-db58-4392-911c-344fcc5a8ce1-kube-api-access-2sljj\") pod \"nova-api-fc07-account-create-update-zxl2v\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.680562 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-operator-scripts\") pod \"nova-cell1-db-create-jfvcn\" (UID: \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.680590 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzk7t\" (UniqueName: \"kubernetes.io/projected/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-kube-api-access-pzk7t\") pod \"nova-cell1-db-create-jfvcn\" (UID: 
\"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.684722 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7f1d157-db58-4392-911c-344fcc5a8ce1-operator-scripts\") pod \"nova-api-fc07-account-create-update-zxl2v\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.698701 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sljj\" (UniqueName: \"kubernetes.io/projected/a7f1d157-db58-4392-911c-344fcc5a8ce1-kube-api-access-2sljj\") pod \"nova-api-fc07-account-create-update-zxl2v\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.760300 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.773512 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerStarted","Data":"25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f"} Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.782158 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-operator-scripts\") pod \"nova-cell0-4bcc-account-create-update-xk8nk\" (UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.782277 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfn6j\" (UniqueName: \"kubernetes.io/projected/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-kube-api-access-sfn6j\") pod \"nova-cell0-4bcc-account-create-update-xk8nk\" (UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.782360 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-operator-scripts\") pod \"nova-cell1-db-create-jfvcn\" (UID: \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.782386 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzk7t\" (UniqueName: \"kubernetes.io/projected/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-kube-api-access-pzk7t\") pod \"nova-cell1-db-create-jfvcn\" (UID: 
\"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.783154 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-operator-scripts\") pod \"nova-cell1-db-create-jfvcn\" (UID: \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.804903 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzk7t\" (UniqueName: \"kubernetes.io/projected/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-kube-api-access-pzk7t\") pod \"nova-cell1-db-create-jfvcn\" (UID: \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.852123 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e2f9-account-create-update-hjzt7"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.853435 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.854986 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.855701 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.867529 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e2f9-account-create-update-hjzt7"] Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.884686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-operator-scripts\") pod \"nova-cell0-4bcc-account-create-update-xk8nk\" (UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.885153 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfn6j\" (UniqueName: \"kubernetes.io/projected/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-kube-api-access-sfn6j\") pod \"nova-cell0-4bcc-account-create-update-xk8nk\" (UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.886070 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-operator-scripts\") pod \"nova-cell0-4bcc-account-create-update-xk8nk\" (UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.908466 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfn6j\" (UniqueName: \"kubernetes.io/projected/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-kube-api-access-sfn6j\") pod \"nova-cell0-4bcc-account-create-update-xk8nk\" 
(UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.987530 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6frh\" (UniqueName: \"kubernetes.io/projected/ce6e332d-a38d-4ec6-b875-aad75c5491f4-kube-api-access-f6frh\") pod \"nova-cell1-e2f9-account-create-update-hjzt7\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:45 crc kubenswrapper[4789]: I1216 07:13:45.987661 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce6e332d-a38d-4ec6-b875-aad75c5491f4-operator-scripts\") pod \"nova-cell1-e2f9-account-create-update-hjzt7\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.086024 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.089526 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce6e332d-a38d-4ec6-b875-aad75c5491f4-operator-scripts\") pod \"nova-cell1-e2f9-account-create-update-hjzt7\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.089712 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6frh\" (UniqueName: \"kubernetes.io/projected/ce6e332d-a38d-4ec6-b875-aad75c5491f4-kube-api-access-f6frh\") pod \"nova-cell1-e2f9-account-create-update-hjzt7\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.090414 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce6e332d-a38d-4ec6-b875-aad75c5491f4-operator-scripts\") pod \"nova-cell1-e2f9-account-create-update-hjzt7\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.111758 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6frh\" (UniqueName: \"kubernetes.io/projected/ce6e332d-a38d-4ec6-b875-aad75c5491f4-kube-api-access-f6frh\") pod \"nova-cell1-e2f9-account-create-update-hjzt7\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.175081 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-76kjb"] Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 
07:13:46.183735 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:46 crc kubenswrapper[4789]: W1216 07:13:46.186078 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf20b8e31_ab07_411d_844b_f69077acbe95.slice/crio-a5e6101821c85c5715dce4d3e99bd2679e455d13dc823fa1e90e08bf181eb447 WatchSource:0}: Error finding container a5e6101821c85c5715dce4d3e99bd2679e455d13dc823fa1e90e08bf181eb447: Status 404 returned error can't find the container with id a5e6101821c85c5715dce4d3e99bd2679e455d13dc823fa1e90e08bf181eb447 Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.194103 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-77l4n"] Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.423292 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fc07-account-create-update-zxl2v"] Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.501816 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jfvcn"] Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.621075 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4bcc-account-create-update-xk8nk"] Dec 16 07:13:46 crc kubenswrapper[4789]: W1216 07:13:46.640380 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7f8aee7_df50_4d02_bdc6_a0feacc6868b.slice/crio-12ca51ca92ff09a96fb569f36b6593a5ddbc795b58f3a9b8ec867cc1b7e6bb30 WatchSource:0}: Error finding container 12ca51ca92ff09a96fb569f36b6593a5ddbc795b58f3a9b8ec867cc1b7e6bb30: Status 404 returned error can't find the container with id 12ca51ca92ff09a96fb569f36b6593a5ddbc795b58f3a9b8ec867cc1b7e6bb30 Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.790580 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jfvcn" event={"ID":"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826","Type":"ContainerStarted","Data":"215a92437ecc029b6769ad208ba95433d7e5c29c4cbeebf262f9e70ea5d8b3e6"} Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.798296 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fc07-account-create-update-zxl2v" event={"ID":"a7f1d157-db58-4392-911c-344fcc5a8ce1","Type":"ContainerStarted","Data":"6f5597a472a5d0ef8a1adb47d6d2717e7f6b18b43098358b7872371182ffd9ac"} Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.800837 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76kjb" event={"ID":"7b2c4ef3-c9dd-497a-b092-d257ed4ef992","Type":"ContainerStarted","Data":"81df070f3270ac471d25411419bf933d40f0a828bdcac35dc7081c9758497a34"} Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.800887 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76kjb" event={"ID":"7b2c4ef3-c9dd-497a-b092-d257ed4ef992","Type":"ContainerStarted","Data":"11583216bc00d15499c2c580f37449336c4c1f6cd133b29f60e6bc9c67a863bf"} Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.808350 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" event={"ID":"d7f8aee7-df50-4d02-bdc6-a0feacc6868b","Type":"ContainerStarted","Data":"12ca51ca92ff09a96fb569f36b6593a5ddbc795b58f3a9b8ec867cc1b7e6bb30"} Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.818181 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-77l4n" event={"ID":"f20b8e31-ab07-411d-844b-f69077acbe95","Type":"ContainerStarted","Data":"a5e6101821c85c5715dce4d3e99bd2679e455d13dc823fa1e90e08bf181eb447"} Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.824050 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-db-create-76kjb" podStartSLOduration=1.824033766 podStartE2EDuration="1.824033766s" podCreationTimestamp="2025-12-16 07:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:13:46.822306424 +0000 UTC m=+1365.084194073" watchObservedRunningTime="2025-12-16 07:13:46.824033766 +0000 UTC m=+1365.085921395" Dec 16 07:13:46 crc kubenswrapper[4789]: I1216 07:13:46.912162 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e2f9-account-create-update-hjzt7"] Dec 16 07:13:46 crc kubenswrapper[4789]: W1216 07:13:46.951905 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce6e332d_a38d_4ec6_b875_aad75c5491f4.slice/crio-1b0a4af8adad64a3a6448d341e61d07fdf267fa5a8232ad5ade3da04b7f84b9b WatchSource:0}: Error finding container 1b0a4af8adad64a3a6448d341e61d07fdf267fa5a8232ad5ade3da04b7f84b9b: Status 404 returned error can't find the container with id 1b0a4af8adad64a3a6448d341e61d07fdf267fa5a8232ad5ade3da04b7f84b9b Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.832835 4789 generic.go:334] "Generic (PLEG): container finished" podID="5cfbba1f-9f1d-4994-b831-e6fd0d7d9826" containerID="5507caaa355e89c063f63a2cefa09e35c12ab6eb6ce0b04a31fa77618d0f85bd" exitCode=0 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.832901 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jfvcn" event={"ID":"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826","Type":"ContainerDied","Data":"5507caaa355e89c063f63a2cefa09e35c12ab6eb6ce0b04a31fa77618d0f85bd"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.841029 4789 generic.go:334] "Generic (PLEG): container finished" podID="a7f1d157-db58-4392-911c-344fcc5a8ce1" containerID="3549bc0cc1314556e102d5bb9c5e370800c3e9c50b9cb85c48387cd83a095e2c" exitCode=0 Dec 16 07:13:47 crc 
kubenswrapper[4789]: I1216 07:13:47.841093 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fc07-account-create-update-zxl2v" event={"ID":"a7f1d157-db58-4392-911c-344fcc5a8ce1","Type":"ContainerDied","Data":"3549bc0cc1314556e102d5bb9c5e370800c3e9c50b9cb85c48387cd83a095e2c"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.847725 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerStarted","Data":"39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.847849 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-central-agent" containerID="cri-o://b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6" gracePeriod=30 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.847931 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.847983 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="sg-core" containerID="cri-o://25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f" gracePeriod=30 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.847992 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-notification-agent" containerID="cri-o://759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada" gracePeriod=30 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.848050 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="proxy-httpd" containerID="cri-o://39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337" gracePeriod=30 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.855229 4789 generic.go:334] "Generic (PLEG): container finished" podID="7b2c4ef3-c9dd-497a-b092-d257ed4ef992" containerID="81df070f3270ac471d25411419bf933d40f0a828bdcac35dc7081c9758497a34" exitCode=0 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.855290 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76kjb" event={"ID":"7b2c4ef3-c9dd-497a-b092-d257ed4ef992","Type":"ContainerDied","Data":"81df070f3270ac471d25411419bf933d40f0a828bdcac35dc7081c9758497a34"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.863397 4789 generic.go:334] "Generic (PLEG): container finished" podID="d7f8aee7-df50-4d02-bdc6-a0feacc6868b" containerID="23e575aa3f3085bd2b26af1b9af05f6c37b3a79d95b825164e72059a6443cf01" exitCode=0 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.863559 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" event={"ID":"d7f8aee7-df50-4d02-bdc6-a0feacc6868b","Type":"ContainerDied","Data":"23e575aa3f3085bd2b26af1b9af05f6c37b3a79d95b825164e72059a6443cf01"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.868452 4789 generic.go:334] "Generic (PLEG): container finished" podID="ce6e332d-a38d-4ec6-b875-aad75c5491f4" containerID="2d1bbeab0e372abd0b616ae4a4940235c89a48f444bb64d8b567503ce48488cb" exitCode=0 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.868572 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" event={"ID":"ce6e332d-a38d-4ec6-b875-aad75c5491f4","Type":"ContainerDied","Data":"2d1bbeab0e372abd0b616ae4a4940235c89a48f444bb64d8b567503ce48488cb"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.868611 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" event={"ID":"ce6e332d-a38d-4ec6-b875-aad75c5491f4","Type":"ContainerStarted","Data":"1b0a4af8adad64a3a6448d341e61d07fdf267fa5a8232ad5ade3da04b7f84b9b"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.872690 4789 generic.go:334] "Generic (PLEG): container finished" podID="f20b8e31-ab07-411d-844b-f69077acbe95" containerID="0b13c83f5ec120dc4feea18f65f1c3f02935a2d38f0813d1ecc61531e9b5b8f2" exitCode=0 Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.872746 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-77l4n" event={"ID":"f20b8e31-ab07-411d-844b-f69077acbe95","Type":"ContainerDied","Data":"0b13c83f5ec120dc4feea18f65f1c3f02935a2d38f0813d1ecc61531e9b5b8f2"} Dec 16 07:13:47 crc kubenswrapper[4789]: I1216 07:13:47.899899 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.844737848 podStartE2EDuration="7.899882068s" podCreationTimestamp="2025-12-16 07:13:40 +0000 UTC" firstStartedPulling="2025-12-16 07:13:41.656176018 +0000 UTC m=+1359.918063647" lastFinishedPulling="2025-12-16 07:13:46.711320238 +0000 UTC m=+1364.973207867" observedRunningTime="2025-12-16 07:13:47.89422244 +0000 UTC m=+1366.156110099" watchObservedRunningTime="2025-12-16 07:13:47.899882068 +0000 UTC m=+1366.161769697" Dec 16 07:13:48 crc kubenswrapper[4789]: I1216 07:13:48.885129 4789 generic.go:334] "Generic (PLEG): container finished" podID="10d05e27-889a-4225-b194-771ccd67b38c" containerID="39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337" exitCode=0 Dec 16 07:13:48 crc kubenswrapper[4789]: I1216 07:13:48.885159 4789 generic.go:334] "Generic (PLEG): container finished" podID="10d05e27-889a-4225-b194-771ccd67b38c" containerID="25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f" exitCode=2 Dec 16 07:13:48 crc kubenswrapper[4789]: I1216 07:13:48.885167 4789 
generic.go:334] "Generic (PLEG): container finished" podID="10d05e27-889a-4225-b194-771ccd67b38c" containerID="759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada" exitCode=0 Dec 16 07:13:48 crc kubenswrapper[4789]: I1216 07:13:48.885188 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerDied","Data":"39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337"} Dec 16 07:13:48 crc kubenswrapper[4789]: I1216 07:13:48.885232 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerDied","Data":"25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f"} Dec 16 07:13:48 crc kubenswrapper[4789]: I1216 07:13:48.885247 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerDied","Data":"759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada"} Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.058058 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.058106 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.090344 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.110611 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.295088 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.467014 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sljj\" (UniqueName: \"kubernetes.io/projected/a7f1d157-db58-4392-911c-344fcc5a8ce1-kube-api-access-2sljj\") pod \"a7f1d157-db58-4392-911c-344fcc5a8ce1\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.467369 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7f1d157-db58-4392-911c-344fcc5a8ce1-operator-scripts\") pod \"a7f1d157-db58-4392-911c-344fcc5a8ce1\" (UID: \"a7f1d157-db58-4392-911c-344fcc5a8ce1\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.468870 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7f1d157-db58-4392-911c-344fcc5a8ce1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7f1d157-db58-4392-911c-344fcc5a8ce1" (UID: "a7f1d157-db58-4392-911c-344fcc5a8ce1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.478881 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f1d157-db58-4392-911c-344fcc5a8ce1-kube-api-access-2sljj" (OuterVolumeSpecName: "kube-api-access-2sljj") pod "a7f1d157-db58-4392-911c-344fcc5a8ce1" (UID: "a7f1d157-db58-4392-911c-344fcc5a8ce1"). InnerVolumeSpecName "kube-api-access-2sljj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.569415 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7f1d157-db58-4392-911c-344fcc5a8ce1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.569451 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sljj\" (UniqueName: \"kubernetes.io/projected/a7f1d157-db58-4392-911c-344fcc5a8ce1-kube-api-access-2sljj\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.615891 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.622364 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.629408 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.674973 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.687736 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.775506 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce6e332d-a38d-4ec6-b875-aad75c5491f4-operator-scripts\") pod \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.775590 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f20b8e31-ab07-411d-844b-f69077acbe95-operator-scripts\") pod \"f20b8e31-ab07-411d-844b-f69077acbe95\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.775638 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-operator-scripts\") pod \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\" (UID: \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.775834 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfn6j\" (UniqueName: \"kubernetes.io/projected/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-kube-api-access-sfn6j\") pod \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\" (UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.775864 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-operator-scripts\") pod \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\" (UID: \"d7f8aee7-df50-4d02-bdc6-a0feacc6868b\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.775954 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-pzk7t\" (UniqueName: \"kubernetes.io/projected/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-kube-api-access-pzk7t\") pod \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\" (UID: \"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.776003 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nc4p\" (UniqueName: \"kubernetes.io/projected/f20b8e31-ab07-411d-844b-f69077acbe95-kube-api-access-5nc4p\") pod \"f20b8e31-ab07-411d-844b-f69077acbe95\" (UID: \"f20b8e31-ab07-411d-844b-f69077acbe95\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.776094 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6frh\" (UniqueName: \"kubernetes.io/projected/ce6e332d-a38d-4ec6-b875-aad75c5491f4-kube-api-access-f6frh\") pod \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\" (UID: \"ce6e332d-a38d-4ec6-b875-aad75c5491f4\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.776267 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce6e332d-a38d-4ec6-b875-aad75c5491f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce6e332d-a38d-4ec6-b875-aad75c5491f4" (UID: "ce6e332d-a38d-4ec6-b875-aad75c5491f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.776540 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5cfbba1f-9f1d-4994-b831-e6fd0d7d9826" (UID: "5cfbba1f-9f1d-4994-b831-e6fd0d7d9826"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.776573 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce6e332d-a38d-4ec6-b875-aad75c5491f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.776629 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7f8aee7-df50-4d02-bdc6-a0feacc6868b" (UID: "d7f8aee7-df50-4d02-bdc6-a0feacc6868b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.776677 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f20b8e31-ab07-411d-844b-f69077acbe95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f20b8e31-ab07-411d-844b-f69077acbe95" (UID: "f20b8e31-ab07-411d-844b-f69077acbe95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.780342 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-kube-api-access-sfn6j" (OuterVolumeSpecName: "kube-api-access-sfn6j") pod "d7f8aee7-df50-4d02-bdc6-a0feacc6868b" (UID: "d7f8aee7-df50-4d02-bdc6-a0feacc6868b"). InnerVolumeSpecName "kube-api-access-sfn6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.781021 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6e332d-a38d-4ec6-b875-aad75c5491f4-kube-api-access-f6frh" (OuterVolumeSpecName: "kube-api-access-f6frh") pod "ce6e332d-a38d-4ec6-b875-aad75c5491f4" (UID: "ce6e332d-a38d-4ec6-b875-aad75c5491f4"). InnerVolumeSpecName "kube-api-access-f6frh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.781091 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-kube-api-access-pzk7t" (OuterVolumeSpecName: "kube-api-access-pzk7t") pod "5cfbba1f-9f1d-4994-b831-e6fd0d7d9826" (UID: "5cfbba1f-9f1d-4994-b831-e6fd0d7d9826"). InnerVolumeSpecName "kube-api-access-pzk7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.782120 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20b8e31-ab07-411d-844b-f69077acbe95-kube-api-access-5nc4p" (OuterVolumeSpecName: "kube-api-access-5nc4p") pod "f20b8e31-ab07-411d-844b-f69077acbe95" (UID: "f20b8e31-ab07-411d-844b-f69077acbe95"). InnerVolumeSpecName "kube-api-access-5nc4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.877489 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64hrt\" (UniqueName: \"kubernetes.io/projected/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-kube-api-access-64hrt\") pod \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878029 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-operator-scripts\") pod \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\" (UID: \"7b2c4ef3-c9dd-497a-b092-d257ed4ef992\") " Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878436 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfn6j\" (UniqueName: \"kubernetes.io/projected/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-kube-api-access-sfn6j\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878457 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7f8aee7-df50-4d02-bdc6-a0feacc6868b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878466 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzk7t\" (UniqueName: \"kubernetes.io/projected/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-kube-api-access-pzk7t\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878474 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nc4p\" (UniqueName: \"kubernetes.io/projected/f20b8e31-ab07-411d-844b-f69077acbe95-kube-api-access-5nc4p\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878484 4789 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-f6frh\" (UniqueName: \"kubernetes.io/projected/ce6e332d-a38d-4ec6-b875-aad75c5491f4-kube-api-access-f6frh\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878492 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f20b8e31-ab07-411d-844b-f69077acbe95-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.878500 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.879478 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b2c4ef3-c9dd-497a-b092-d257ed4ef992" (UID: "7b2c4ef3-c9dd-497a-b092-d257ed4ef992"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.882689 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-kube-api-access-64hrt" (OuterVolumeSpecName: "kube-api-access-64hrt") pod "7b2c4ef3-c9dd-497a-b092-d257ed4ef992" (UID: "7b2c4ef3-c9dd-497a-b092-d257ed4ef992"). InnerVolumeSpecName "kube-api-access-64hrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.896891 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fc07-account-create-update-zxl2v" event={"ID":"a7f1d157-db58-4392-911c-344fcc5a8ce1","Type":"ContainerDied","Data":"6f5597a472a5d0ef8a1adb47d6d2717e7f6b18b43098358b7872371182ffd9ac"} Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.896977 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f5597a472a5d0ef8a1adb47d6d2717e7f6b18b43098358b7872371182ffd9ac" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.896952 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fc07-account-create-update-zxl2v" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.899990 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-76kjb" event={"ID":"7b2c4ef3-c9dd-497a-b092-d257ed4ef992","Type":"ContainerDied","Data":"11583216bc00d15499c2c580f37449336c4c1f6cd133b29f60e6bc9c67a863bf"} Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.900039 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11583216bc00d15499c2c580f37449336c4c1f6cd133b29f60e6bc9c67a863bf" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.900061 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-76kjb" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.903812 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.903994 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4bcc-account-create-update-xk8nk" event={"ID":"d7f8aee7-df50-4d02-bdc6-a0feacc6868b","Type":"ContainerDied","Data":"12ca51ca92ff09a96fb569f36b6593a5ddbc795b58f3a9b8ec867cc1b7e6bb30"} Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.904048 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ca51ca92ff09a96fb569f36b6593a5ddbc795b58f3a9b8ec867cc1b7e6bb30" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.905472 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" event={"ID":"ce6e332d-a38d-4ec6-b875-aad75c5491f4","Type":"ContainerDied","Data":"1b0a4af8adad64a3a6448d341e61d07fdf267fa5a8232ad5ade3da04b7f84b9b"} Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.905487 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e2f9-account-create-update-hjzt7" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.905499 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0a4af8adad64a3a6448d341e61d07fdf267fa5a8232ad5ade3da04b7f84b9b" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.906544 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-77l4n" event={"ID":"f20b8e31-ab07-411d-844b-f69077acbe95","Type":"ContainerDied","Data":"a5e6101821c85c5715dce4d3e99bd2679e455d13dc823fa1e90e08bf181eb447"} Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.906561 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5e6101821c85c5715dce4d3e99bd2679e455d13dc823fa1e90e08bf181eb447" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.906823 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-77l4n" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.907963 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jfvcn" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.907979 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jfvcn" event={"ID":"5cfbba1f-9f1d-4994-b831-e6fd0d7d9826","Type":"ContainerDied","Data":"215a92437ecc029b6769ad208ba95433d7e5c29c4cbeebf262f9e70ea5d8b3e6"} Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.908006 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="215a92437ecc029b6769ad208ba95433d7e5c29c4cbeebf262f9e70ea5d8b3e6" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.908451 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.908494 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.980337 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64hrt\" (UniqueName: \"kubernetes.io/projected/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-kube-api-access-64hrt\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:49 crc kubenswrapper[4789]: I1216 07:13:49.980386 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b2c4ef3-c9dd-497a-b092-d257ed4ef992-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.377141 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.377191 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.406593 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.419586 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.916559 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.916618 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.962130 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2ln4f"] Dec 16 07:13:50 crc kubenswrapper[4789]: E1216 07:13:50.962569 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6e332d-a38d-4ec6-b875-aad75c5491f4" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.962592 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6e332d-a38d-4ec6-b875-aad75c5491f4" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: E1216 07:13:50.962611 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f1d157-db58-4392-911c-344fcc5a8ce1" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.962623 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f1d157-db58-4392-911c-344fcc5a8ce1" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: E1216 07:13:50.962641 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfbba1f-9f1d-4994-b831-e6fd0d7d9826" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.962651 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfbba1f-9f1d-4994-b831-e6fd0d7d9826" 
containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: E1216 07:13:50.962667 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2c4ef3-c9dd-497a-b092-d257ed4ef992" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.962674 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2c4ef3-c9dd-497a-b092-d257ed4ef992" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: E1216 07:13:50.962699 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f8aee7-df50-4d02-bdc6-a0feacc6868b" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.962709 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f8aee7-df50-4d02-bdc6-a0feacc6868b" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: E1216 07:13:50.962757 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20b8e31-ab07-411d-844b-f69077acbe95" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.962766 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20b8e31-ab07-411d-844b-f69077acbe95" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.963035 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6e332d-a38d-4ec6-b875-aad75c5491f4" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.963059 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f8aee7-df50-4d02-bdc6-a0feacc6868b" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.963073 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2c4ef3-c9dd-497a-b092-d257ed4ef992" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.963086 
4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f1d157-db58-4392-911c-344fcc5a8ce1" containerName="mariadb-account-create-update" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.963101 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfbba1f-9f1d-4994-b831-e6fd0d7d9826" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.963121 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20b8e31-ab07-411d-844b-f69077acbe95" containerName="mariadb-database-create" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.963856 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.965716 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.965881 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n8kd2" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.967604 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 07:13:50 crc kubenswrapper[4789]: I1216 07:13:50.973091 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2ln4f"] Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.099898 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vjl\" (UniqueName: \"kubernetes.io/projected/e72bc32c-5282-4477-9bc0-450e94561956-kube-api-access-p2vjl\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.099972 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-config-data\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.100189 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-scripts\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.101020 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.202610 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.202749 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vjl\" (UniqueName: \"kubernetes.io/projected/e72bc32c-5282-4477-9bc0-450e94561956-kube-api-access-p2vjl\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc 
kubenswrapper[4789]: I1216 07:13:51.202778 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-config-data\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.202871 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-scripts\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.210062 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.210114 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-config-data\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.210291 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-scripts\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.229514 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vjl\" (UniqueName: \"kubernetes.io/projected/e72bc32c-5282-4477-9bc0-450e94561956-kube-api-access-p2vjl\") pod \"nova-cell0-conductor-db-sync-2ln4f\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") " pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.279353 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2ln4f" Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.772693 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2ln4f"] Dec 16 07:13:51 crc kubenswrapper[4789]: W1216 07:13:51.776748 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode72bc32c_5282_4477_9bc0_450e94561956.slice/crio-2480b1ebe4ab9b478588172bbd3d3e84150a73627dab7889efcb79d3b28d2e23 WatchSource:0}: Error finding container 2480b1ebe4ab9b478588172bbd3d3e84150a73627dab7889efcb79d3b28d2e23: Status 404 returned error can't find the container with id 2480b1ebe4ab9b478588172bbd3d3e84150a73627dab7889efcb79d3b28d2e23 Dec 16 07:13:51 crc kubenswrapper[4789]: I1216 07:13:51.931508 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2ln4f" event={"ID":"e72bc32c-5282-4477-9bc0-450e94561956","Type":"ContainerStarted","Data":"2480b1ebe4ab9b478588172bbd3d3e84150a73627dab7889efcb79d3b28d2e23"} Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.151024 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.152165 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.159124 4789 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.407840 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.543817 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcxnl\" (UniqueName: \"kubernetes.io/projected/10d05e27-889a-4225-b194-771ccd67b38c-kube-api-access-rcxnl\") pod \"10d05e27-889a-4225-b194-771ccd67b38c\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.543992 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-sg-core-conf-yaml\") pod \"10d05e27-889a-4225-b194-771ccd67b38c\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.544022 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-combined-ca-bundle\") pod \"10d05e27-889a-4225-b194-771ccd67b38c\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.544076 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-config-data\") pod \"10d05e27-889a-4225-b194-771ccd67b38c\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.544165 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-log-httpd\") pod \"10d05e27-889a-4225-b194-771ccd67b38c\" (UID: 
\"10d05e27-889a-4225-b194-771ccd67b38c\") " Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.544204 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-scripts\") pod \"10d05e27-889a-4225-b194-771ccd67b38c\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.544233 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-run-httpd\") pod \"10d05e27-889a-4225-b194-771ccd67b38c\" (UID: \"10d05e27-889a-4225-b194-771ccd67b38c\") " Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.544612 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "10d05e27-889a-4225-b194-771ccd67b38c" (UID: "10d05e27-889a-4225-b194-771ccd67b38c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.544630 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "10d05e27-889a-4225-b194-771ccd67b38c" (UID: "10d05e27-889a-4225-b194-771ccd67b38c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.550292 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-scripts" (OuterVolumeSpecName: "scripts") pod "10d05e27-889a-4225-b194-771ccd67b38c" (UID: "10d05e27-889a-4225-b194-771ccd67b38c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.552682 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d05e27-889a-4225-b194-771ccd67b38c-kube-api-access-rcxnl" (OuterVolumeSpecName: "kube-api-access-rcxnl") pod "10d05e27-889a-4225-b194-771ccd67b38c" (UID: "10d05e27-889a-4225-b194-771ccd67b38c"). InnerVolumeSpecName "kube-api-access-rcxnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.601829 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "10d05e27-889a-4225-b194-771ccd67b38c" (UID: "10d05e27-889a-4225-b194-771ccd67b38c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.646824 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.646852 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.646860 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d05e27-889a-4225-b194-771ccd67b38c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.646869 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcxnl\" (UniqueName: \"kubernetes.io/projected/10d05e27-889a-4225-b194-771ccd67b38c-kube-api-access-rcxnl\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.646881 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.654111 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-config-data" (OuterVolumeSpecName: "config-data") pod "10d05e27-889a-4225-b194-771ccd67b38c" (UID: "10d05e27-889a-4225-b194-771ccd67b38c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.664799 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10d05e27-889a-4225-b194-771ccd67b38c" (UID: "10d05e27-889a-4225-b194-771ccd67b38c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.748260 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.748439 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d05e27-889a-4225-b194-771ccd67b38c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.950724 4789 generic.go:334] "Generic (PLEG): container finished" podID="10d05e27-889a-4225-b194-771ccd67b38c" containerID="b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6" exitCode=0 Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.950895 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.950904 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerDied","Data":"b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6"} Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.950988 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d05e27-889a-4225-b194-771ccd67b38c","Type":"ContainerDied","Data":"862a597d36230daa058271baa6aa6fed90b6f780600ee2c00d3ede1a36f28a6e"} Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.951012 4789 scope.go:117] "RemoveContainer" containerID="39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337" Dec 16 07:13:52 crc kubenswrapper[4789]: I1216 07:13:52.996304 4789 scope.go:117] "RemoveContainer" containerID="25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f" Dec 16 07:13:53 crc 
kubenswrapper[4789]: I1216 07:13:53.011388 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.028114 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.039786 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.040306 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-notification-agent" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040321 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-notification-agent" Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.040358 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="proxy-httpd" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040365 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="proxy-httpd" Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.040376 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-central-agent" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040385 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-central-agent" Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.040396 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="sg-core" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040404 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="sg-core" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040617 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="proxy-httpd" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040632 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="sg-core" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040648 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-central-agent" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.040662 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d05e27-889a-4225-b194-771ccd67b38c" containerName="ceilometer-notification-agent" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.042577 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.047666 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.047796 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.051723 4789 scope.go:117] "RemoveContainer" containerID="759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.053770 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.104282 4789 scope.go:117] "RemoveContainer" containerID="b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.128340 4789 scope.go:117] "RemoveContainer" containerID="39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337" Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.129024 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337\": container with ID starting with 39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337 not found: ID does not exist" containerID="39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.129230 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337"} err="failed to get container status \"39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337\": rpc error: code = NotFound desc = could not find container \"39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337\": 
container with ID starting with 39470b35fb6d8fa80929f2fa8bed7da310f3eb0078cecb9d9e36fef5dbbc1337 not found: ID does not exist" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.129354 4789 scope.go:117] "RemoveContainer" containerID="25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f" Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.130251 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f\": container with ID starting with 25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f not found: ID does not exist" containerID="25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.130289 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f"} err="failed to get container status \"25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f\": rpc error: code = NotFound desc = could not find container \"25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f\": container with ID starting with 25b30cb50d76d505f2ac525df927f32ba40c2ba5829b11d9a7ca49977b99db6f not found: ID does not exist" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.130313 4789 scope.go:117] "RemoveContainer" containerID="759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada" Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.130835 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada\": container with ID starting with 759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada not found: ID does not exist" 
containerID="759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.130861 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada"} err="failed to get container status \"759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada\": rpc error: code = NotFound desc = could not find container \"759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada\": container with ID starting with 759be8997dca9df9b07945aa31fce9e84cb91d14c5556859dc328897403d9ada not found: ID does not exist" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.130876 4789 scope.go:117] "RemoveContainer" containerID="b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6" Dec 16 07:13:53 crc kubenswrapper[4789]: E1216 07:13:53.131467 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6\": container with ID starting with b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6 not found: ID does not exist" containerID="b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.131490 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6"} err="failed to get container status \"b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6\": rpc error: code = NotFound desc = could not find container \"b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6\": container with ID starting with b00e7b10d3cbe604b819515ddaa88ff8a98359bb4d78abc96fc094d74887f0c6 not found: ID does not exist" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.156513 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.156615 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pnlj\" (UniqueName: \"kubernetes.io/projected/829d8064-b7b4-43d1-99b4-258f3854c26f-kube-api-access-4pnlj\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.156719 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.157424 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-log-httpd\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.157474 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-scripts\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.157491 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-run-httpd\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.157539 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-config-data\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.259705 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-log-httpd\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.259759 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-scripts\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.259774 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-run-httpd\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.259795 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-config-data\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.259939 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.259975 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pnlj\" (UniqueName: \"kubernetes.io/projected/829d8064-b7b4-43d1-99b4-258f3854c26f-kube-api-access-4pnlj\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.260049 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.260905 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-log-httpd\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.260998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-run-httpd\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.267306 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-config-data\") pod \"ceilometer-0\" (UID: 
\"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.267798 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-scripts\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.269631 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.279527 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.280714 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pnlj\" (UniqueName: \"kubernetes.io/projected/829d8064-b7b4-43d1-99b4-258f3854c26f-kube-api-access-4pnlj\") pod \"ceilometer-0\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.364182 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.538868 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.539427 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.625269 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.884496 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:53 crc kubenswrapper[4789]: W1216 07:13:53.887369 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod829d8064_b7b4_43d1_99b4_258f3854c26f.slice/crio-a3198a2f3081b33c9f9c7ae53570d2efa227aa7fcd206034e4a38a9e04dc3a0f WatchSource:0}: Error finding container a3198a2f3081b33c9f9c7ae53570d2efa227aa7fcd206034e4a38a9e04dc3a0f: Status 404 returned error can't find the container with id a3198a2f3081b33c9f9c7ae53570d2efa227aa7fcd206034e4a38a9e04dc3a0f Dec 16 07:13:53 crc kubenswrapper[4789]: I1216 07:13:53.969390 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerStarted","Data":"a3198a2f3081b33c9f9c7ae53570d2efa227aa7fcd206034e4a38a9e04dc3a0f"} Dec 16 07:13:54 crc kubenswrapper[4789]: I1216 07:13:54.146134 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d05e27-889a-4225-b194-771ccd67b38c" path="/var/lib/kubelet/pods/10d05e27-889a-4225-b194-771ccd67b38c/volumes" Dec 16 07:13:55 crc kubenswrapper[4789]: I1216 07:13:55.091226 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:13:56 crc kubenswrapper[4789]: I1216 
07:13:56.000833 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerStarted","Data":"fa5e57b2c7738008621c1c280d629414cfab825de265493d8b990ed1c695911c"} Dec 16 07:14:01 crc kubenswrapper[4789]: I1216 07:14:01.046659 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2ln4f" event={"ID":"e72bc32c-5282-4477-9bc0-450e94561956","Type":"ContainerStarted","Data":"f04069955550582d7569e8c1e11f96772b8e1ef3da29e68d4a5c4e6db554a44f"} Dec 16 07:14:01 crc kubenswrapper[4789]: I1216 07:14:01.049973 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerStarted","Data":"1af13a7ab140f5e791962f5f43fe7435725bea39a70bc88bacd5b6419f1058dc"} Dec 16 07:14:01 crc kubenswrapper[4789]: I1216 07:14:01.062248 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-2ln4f" podStartSLOduration=2.504982906 podStartE2EDuration="11.062230023s" podCreationTimestamp="2025-12-16 07:13:50 +0000 UTC" firstStartedPulling="2025-12-16 07:13:51.781635627 +0000 UTC m=+1370.043523256" lastFinishedPulling="2025-12-16 07:14:00.338882744 +0000 UTC m=+1378.600770373" observedRunningTime="2025-12-16 07:14:01.059522757 +0000 UTC m=+1379.321410386" watchObservedRunningTime="2025-12-16 07:14:01.062230023 +0000 UTC m=+1379.324117652" Dec 16 07:14:02 crc kubenswrapper[4789]: I1216 07:14:02.062171 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerStarted","Data":"35289860942cc6db9f2edb94b7cc43e9959df1ea5998df5ebace70f6fd0feeee"} Dec 16 07:14:04 crc kubenswrapper[4789]: I1216 07:14:04.083517 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerStarted","Data":"5c4f17f77fc75eb221589e497e460a1b58396a1506628a1fb2a3e28d3149eeb8"} Dec 16 07:14:04 crc kubenswrapper[4789]: I1216 07:14:04.083705 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-central-agent" containerID="cri-o://fa5e57b2c7738008621c1c280d629414cfab825de265493d8b990ed1c695911c" gracePeriod=30 Dec 16 07:14:04 crc kubenswrapper[4789]: I1216 07:14:04.083937 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-notification-agent" containerID="cri-o://1af13a7ab140f5e791962f5f43fe7435725bea39a70bc88bacd5b6419f1058dc" gracePeriod=30 Dec 16 07:14:04 crc kubenswrapper[4789]: I1216 07:14:04.083943 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="sg-core" containerID="cri-o://35289860942cc6db9f2edb94b7cc43e9959df1ea5998df5ebace70f6fd0feeee" gracePeriod=30 Dec 16 07:14:04 crc kubenswrapper[4789]: I1216 07:14:04.084000 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:14:04 crc kubenswrapper[4789]: I1216 07:14:04.084002 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="proxy-httpd" containerID="cri-o://5c4f17f77fc75eb221589e497e460a1b58396a1506628a1fb2a3e28d3149eeb8" gracePeriod=30 Dec 16 07:14:04 crc kubenswrapper[4789]: I1216 07:14:04.122371 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.060284665 podStartE2EDuration="12.122352785s" podCreationTimestamp="2025-12-16 07:13:52 +0000 UTC" 
firstStartedPulling="2025-12-16 07:13:53.895180894 +0000 UTC m=+1372.157068523" lastFinishedPulling="2025-12-16 07:14:02.957249014 +0000 UTC m=+1381.219136643" observedRunningTime="2025-12-16 07:14:04.116682427 +0000 UTC m=+1382.378570056" watchObservedRunningTime="2025-12-16 07:14:04.122352785 +0000 UTC m=+1382.384240414" Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.105850 4789 generic.go:334] "Generic (PLEG): container finished" podID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerID="5c4f17f77fc75eb221589e497e460a1b58396a1506628a1fb2a3e28d3149eeb8" exitCode=0 Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.106511 4789 generic.go:334] "Generic (PLEG): container finished" podID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerID="35289860942cc6db9f2edb94b7cc43e9959df1ea5998df5ebace70f6fd0feeee" exitCode=2 Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.106524 4789 generic.go:334] "Generic (PLEG): container finished" podID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerID="1af13a7ab140f5e791962f5f43fe7435725bea39a70bc88bacd5b6419f1058dc" exitCode=0 Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.106532 4789 generic.go:334] "Generic (PLEG): container finished" podID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerID="fa5e57b2c7738008621c1c280d629414cfab825de265493d8b990ed1c695911c" exitCode=0 Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.106009 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerDied","Data":"5c4f17f77fc75eb221589e497e460a1b58396a1506628a1fb2a3e28d3149eeb8"} Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.106573 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerDied","Data":"35289860942cc6db9f2edb94b7cc43e9959df1ea5998df5ebace70f6fd0feeee"} Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 
07:14:05.106588 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerDied","Data":"1af13a7ab140f5e791962f5f43fe7435725bea39a70bc88bacd5b6419f1058dc"} Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.106598 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerDied","Data":"fa5e57b2c7738008621c1c280d629414cfab825de265493d8b990ed1c695911c"} Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.261431 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.402840 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-run-httpd\") pod \"829d8064-b7b4-43d1-99b4-258f3854c26f\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.402900 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pnlj\" (UniqueName: \"kubernetes.io/projected/829d8064-b7b4-43d1-99b4-258f3854c26f-kube-api-access-4pnlj\") pod \"829d8064-b7b4-43d1-99b4-258f3854c26f\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.402951 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-config-data\") pod \"829d8064-b7b4-43d1-99b4-258f3854c26f\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.402983 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-sg-core-conf-yaml\") pod \"829d8064-b7b4-43d1-99b4-258f3854c26f\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.403032 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-scripts\") pod \"829d8064-b7b4-43d1-99b4-258f3854c26f\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.403058 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-log-httpd\") pod \"829d8064-b7b4-43d1-99b4-258f3854c26f\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.403076 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-combined-ca-bundle\") pod \"829d8064-b7b4-43d1-99b4-258f3854c26f\" (UID: \"829d8064-b7b4-43d1-99b4-258f3854c26f\") " Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.403687 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "829d8064-b7b4-43d1-99b4-258f3854c26f" (UID: "829d8064-b7b4-43d1-99b4-258f3854c26f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.404283 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "829d8064-b7b4-43d1-99b4-258f3854c26f" (UID: "829d8064-b7b4-43d1-99b4-258f3854c26f"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.409200 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829d8064-b7b4-43d1-99b4-258f3854c26f-kube-api-access-4pnlj" (OuterVolumeSpecName: "kube-api-access-4pnlj") pod "829d8064-b7b4-43d1-99b4-258f3854c26f" (UID: "829d8064-b7b4-43d1-99b4-258f3854c26f"). InnerVolumeSpecName "kube-api-access-4pnlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.410878 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-scripts" (OuterVolumeSpecName: "scripts") pod "829d8064-b7b4-43d1-99b4-258f3854c26f" (UID: "829d8064-b7b4-43d1-99b4-258f3854c26f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.431977 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "829d8064-b7b4-43d1-99b4-258f3854c26f" (UID: "829d8064-b7b4-43d1-99b4-258f3854c26f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.488176 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "829d8064-b7b4-43d1-99b4-258f3854c26f" (UID: "829d8064-b7b4-43d1-99b4-258f3854c26f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.495807 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-config-data" (OuterVolumeSpecName: "config-data") pod "829d8064-b7b4-43d1-99b4-258f3854c26f" (UID: "829d8064-b7b4-43d1-99b4-258f3854c26f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.505610 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.505651 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pnlj\" (UniqueName: \"kubernetes.io/projected/829d8064-b7b4-43d1-99b4-258f3854c26f-kube-api-access-4pnlj\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.505664 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.505674 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.505685 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.505696 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/829d8064-b7b4-43d1-99b4-258f3854c26f-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:05 crc kubenswrapper[4789]: I1216 07:14:05.505706 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829d8064-b7b4-43d1-99b4-258f3854c26f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.117689 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.123865 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"829d8064-b7b4-43d1-99b4-258f3854c26f","Type":"ContainerDied","Data":"a3198a2f3081b33c9f9c7ae53570d2efa227aa7fcd206034e4a38a9e04dc3a0f"}
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.123942 4789 scope.go:117] "RemoveContainer" containerID="5c4f17f77fc75eb221589e497e460a1b58396a1506628a1fb2a3e28d3149eeb8"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.148822 4789 scope.go:117] "RemoveContainer" containerID="35289860942cc6db9f2edb94b7cc43e9959df1ea5998df5ebace70f6fd0feeee"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.171834 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.175594 4789 scope.go:117] "RemoveContainer" containerID="1af13a7ab140f5e791962f5f43fe7435725bea39a70bc88bacd5b6419f1058dc"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.203965 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.229057 4789 scope.go:117] "RemoveContainer" containerID="fa5e57b2c7738008621c1c280d629414cfab825de265493d8b990ed1c695911c"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.270898 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:14:06 crc kubenswrapper[4789]: E1216 07:14:06.271795 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-notification-agent"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.280934 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-notification-agent"
Dec 16 07:14:06 crc kubenswrapper[4789]: E1216 07:14:06.280995 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="sg-core"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.281013 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="sg-core"
Dec 16 07:14:06 crc kubenswrapper[4789]: E1216 07:14:06.281121 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-central-agent"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.281129 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-central-agent"
Dec 16 07:14:06 crc kubenswrapper[4789]: E1216 07:14:06.281138 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="proxy-httpd"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.281144 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="proxy-httpd"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.281793 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="sg-core"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.281837 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-central-agent"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.281868 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="proxy-httpd"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.281879 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" containerName="ceilometer-notification-agent"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.331402 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.331546 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.337467 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.337792 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.445886 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-run-httpd\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.446726 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl2n2\" (UniqueName: \"kubernetes.io/projected/787226ff-bf02-4615-91b7-e5aa06525027-kube-api-access-vl2n2\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.447019 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-config-data\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.447099 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-scripts\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.447160 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.447222 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-log-httpd\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.447240 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549249 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-run-httpd\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549345 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl2n2\" (UniqueName: \"kubernetes.io/projected/787226ff-bf02-4615-91b7-e5aa06525027-kube-api-access-vl2n2\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549398 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-config-data\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549548 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-scripts\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549649 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549694 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-log-httpd\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549717 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.549725 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-run-httpd\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.550109 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-log-httpd\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.554360 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.554545 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.554856 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-scripts\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.563605 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-config-data\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.565815 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl2n2\" (UniqueName: \"kubernetes.io/projected/787226ff-bf02-4615-91b7-e5aa06525027-kube-api-access-vl2n2\") pod \"ceilometer-0\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " pod="openstack/ceilometer-0"
Dec 16 07:14:06 crc kubenswrapper[4789]: I1216 07:14:06.675636 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 07:14:07 crc kubenswrapper[4789]: I1216 07:14:07.126750 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:14:08 crc kubenswrapper[4789]: I1216 07:14:08.118399 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829d8064-b7b4-43d1-99b4-258f3854c26f" path="/var/lib/kubelet/pods/829d8064-b7b4-43d1-99b4-258f3854c26f/volumes"
Dec 16 07:14:08 crc kubenswrapper[4789]: I1216 07:14:08.148456 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerStarted","Data":"67f22bbd18a94287a5f91c742e4faf9db04658d3f364ae8a05be53c49e2bddd2"}
Dec 16 07:14:10 crc kubenswrapper[4789]: I1216 07:14:10.166087 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerStarted","Data":"b79cb583fbbb85620150b995e02876362ccfffacd2f652db343ed7fdf4bbd7ea"}
Dec 16 07:14:11 crc kubenswrapper[4789]: I1216 07:14:11.176499 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerStarted","Data":"f65d9f5c61e599c970b9cdce1149af994d2d92774c51a54cfeb707abc5e630e6"}
Dec 16 07:14:12 crc kubenswrapper[4789]: I1216 07:14:12.188560 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerStarted","Data":"6caaba2e9caa91981b19f656daa04c6280bee014350ca8ad8aaae1e9f5256994"}
Dec 16 07:14:13 crc kubenswrapper[4789]: I1216 07:14:13.198847 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerStarted","Data":"475e135f8166ce02f93701cd667929e5a6db20de00a10020d3974c0a405a443c"}
Dec 16 07:14:13 crc kubenswrapper[4789]: I1216 07:14:13.199728 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 16 07:14:13 crc kubenswrapper[4789]: I1216 07:14:13.227983 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.824914989 podStartE2EDuration="7.22796032s" podCreationTimestamp="2025-12-16 07:14:06 +0000 UTC" firstStartedPulling="2025-12-16 07:14:07.127998681 +0000 UTC m=+1385.389886310" lastFinishedPulling="2025-12-16 07:14:12.531044012 +0000 UTC m=+1390.792931641" observedRunningTime="2025-12-16 07:14:13.217878156 +0000 UTC m=+1391.479765805" watchObservedRunningTime="2025-12-16 07:14:13.22796032 +0000 UTC m=+1391.489847949"
Dec 16 07:14:14 crc kubenswrapper[4789]: I1216 07:14:14.209787 4789 generic.go:334] "Generic (PLEG): container finished" podID="e72bc32c-5282-4477-9bc0-450e94561956" containerID="f04069955550582d7569e8c1e11f96772b8e1ef3da29e68d4a5c4e6db554a44f" exitCode=0
Dec 16 07:14:14 crc kubenswrapper[4789]: I1216 07:14:14.209947 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2ln4f" event={"ID":"e72bc32c-5282-4477-9bc0-450e94561956","Type":"ContainerDied","Data":"f04069955550582d7569e8c1e11f96772b8e1ef3da29e68d4a5c4e6db554a44f"}
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.562050 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2ln4f"
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.643716 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-combined-ca-bundle\") pod \"e72bc32c-5282-4477-9bc0-450e94561956\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") "
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.644008 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vjl\" (UniqueName: \"kubernetes.io/projected/e72bc32c-5282-4477-9bc0-450e94561956-kube-api-access-p2vjl\") pod \"e72bc32c-5282-4477-9bc0-450e94561956\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") "
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.644069 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-scripts\") pod \"e72bc32c-5282-4477-9bc0-450e94561956\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") "
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.644157 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-config-data\") pod \"e72bc32c-5282-4477-9bc0-450e94561956\" (UID: \"e72bc32c-5282-4477-9bc0-450e94561956\") "
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.649682 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-scripts" (OuterVolumeSpecName: "scripts") pod "e72bc32c-5282-4477-9bc0-450e94561956" (UID: "e72bc32c-5282-4477-9bc0-450e94561956"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.663251 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e72bc32c-5282-4477-9bc0-450e94561956-kube-api-access-p2vjl" (OuterVolumeSpecName: "kube-api-access-p2vjl") pod "e72bc32c-5282-4477-9bc0-450e94561956" (UID: "e72bc32c-5282-4477-9bc0-450e94561956"). InnerVolumeSpecName "kube-api-access-p2vjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.668963 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-config-data" (OuterVolumeSpecName: "config-data") pod "e72bc32c-5282-4477-9bc0-450e94561956" (UID: "e72bc32c-5282-4477-9bc0-450e94561956"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.674376 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e72bc32c-5282-4477-9bc0-450e94561956" (UID: "e72bc32c-5282-4477-9bc0-450e94561956"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.749310 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vjl\" (UniqueName: \"kubernetes.io/projected/e72bc32c-5282-4477-9bc0-450e94561956-kube-api-access-p2vjl\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.749340 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.749349 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:15 crc kubenswrapper[4789]: I1216 07:14:15.749358 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e72bc32c-5282-4477-9bc0-450e94561956-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.229785 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2ln4f" event={"ID":"e72bc32c-5282-4477-9bc0-450e94561956","Type":"ContainerDied","Data":"2480b1ebe4ab9b478588172bbd3d3e84150a73627dab7889efcb79d3b28d2e23"}
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.229833 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2480b1ebe4ab9b478588172bbd3d3e84150a73627dab7889efcb79d3b28d2e23"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.229856 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2ln4f"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.332186 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 07:14:16 crc kubenswrapper[4789]: E1216 07:14:16.332694 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e72bc32c-5282-4477-9bc0-450e94561956" containerName="nova-cell0-conductor-db-sync"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.332717 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e72bc32c-5282-4477-9bc0-450e94561956" containerName="nova-cell0-conductor-db-sync"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.332931 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e72bc32c-5282-4477-9bc0-450e94561956" containerName="nova-cell0-conductor-db-sync"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.333724 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.340059 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n8kd2"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.340272 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.342116 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.367424 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsgtr\" (UniqueName: \"kubernetes.io/projected/358d8958-a563-407c-8b7f-75aee19a3699-kube-api-access-tsgtr\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.367549 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.367599 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.470050 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgtr\" (UniqueName: \"kubernetes.io/projected/358d8958-a563-407c-8b7f-75aee19a3699-kube-api-access-tsgtr\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.470127 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.470170 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.475333 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.480541 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.486905 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsgtr\" (UniqueName: \"kubernetes.io/projected/358d8958-a563-407c-8b7f-75aee19a3699-kube-api-access-tsgtr\") pod \"nova-cell0-conductor-0\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:16 crc kubenswrapper[4789]: I1216 07:14:16.649721 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:17 crc kubenswrapper[4789]: I1216 07:14:17.099547 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 07:14:17 crc kubenswrapper[4789]: W1216 07:14:17.112455 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod358d8958_a563_407c_8b7f_75aee19a3699.slice/crio-3d928f0f544dc5675e5d11d3ec179405a3a23b2e72e60efe823b2788d8fbe6ad WatchSource:0}: Error finding container 3d928f0f544dc5675e5d11d3ec179405a3a23b2e72e60efe823b2788d8fbe6ad: Status 404 returned error can't find the container with id 3d928f0f544dc5675e5d11d3ec179405a3a23b2e72e60efe823b2788d8fbe6ad
Dec 16 07:14:17 crc kubenswrapper[4789]: I1216 07:14:17.248555 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"358d8958-a563-407c-8b7f-75aee19a3699","Type":"ContainerStarted","Data":"3d928f0f544dc5675e5d11d3ec179405a3a23b2e72e60efe823b2788d8fbe6ad"}
Dec 16 07:14:18 crc kubenswrapper[4789]: I1216 07:14:18.259832 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"358d8958-a563-407c-8b7f-75aee19a3699","Type":"ContainerStarted","Data":"7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c"}
Dec 16 07:14:18 crc kubenswrapper[4789]: I1216 07:14:18.261338 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:18 crc kubenswrapper[4789]: I1216 07:14:18.289524 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.289485855 podStartE2EDuration="2.289485855s" podCreationTimestamp="2025-12-16 07:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:18.280891517 +0000 UTC m=+1396.542779146" watchObservedRunningTime="2025-12-16 07:14:18.289485855 +0000 UTC m=+1396.551373484"
Dec 16 07:14:26 crc kubenswrapper[4789]: I1216 07:14:26.677750 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.104695 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4kpsb"]
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.106167 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.107941 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.108935 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.117538 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4kpsb"]
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.161846 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-scripts\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.161954 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzhzf\" (UniqueName: \"kubernetes.io/projected/f73fa10b-54a6-4292-be91-84657f4a43cd-kube-api-access-dzhzf\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.162006 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-config-data\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.162039 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.263208 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-scripts\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.263310 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzhzf\" (UniqueName: \"kubernetes.io/projected/f73fa10b-54a6-4292-be91-84657f4a43cd-kube-api-access-dzhzf\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.263360 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-config-data\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.263391 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.271090 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.271169 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-config-data\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.271674 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-scripts\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.300194 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.302535 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.304202 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzhzf\" (UniqueName: \"kubernetes.io/projected/f73fa10b-54a6-4292-be91-84657f4a43cd-kube-api-access-dzhzf\") pod \"nova-cell0-cell-mapping-4kpsb\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.308476 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.320238 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.365410 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.365464 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh6s6\" (UniqueName: \"kubernetes.io/projected/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-kube-api-access-nh6s6\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.365510 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-config-data\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.365551 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-logs\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.399524 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.404260 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.406937 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.417724 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.465104 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.466233 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.470244 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.475170 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4kpsb"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.477871 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.478318 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzhxw\" (UniqueName: \"kubernetes.io/projected/cae1f599-5496-4a3f-8f82-b77d9923f9aa-kube-api-access-xzhxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.478468 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.478585 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh6s6\" (UniqueName: \"kubernetes.io/projected/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-kube-api-access-nh6s6\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.478711 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.478864 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-config-data\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.479048 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-logs\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.479887 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-logs\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.482023 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.483700 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-config-data\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.483798 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.519303 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nh6s6\" (UniqueName: \"kubernetes.io/projected/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-kube-api-access-nh6s6\") pod \"nova-api-0\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " pod="openstack/nova-api-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.559537 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.566393 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.568952 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.580363 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzhxw\" (UniqueName: \"kubernetes.io/projected/cae1f599-5496-4a3f-8f82-b77d9923f9aa-kube-api-access-xzhxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.580416 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.580474 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.597440 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.601543 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.608236 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.618604 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzhxw\" (UniqueName: \"kubernetes.io/projected/cae1f599-5496-4a3f-8f82-b77d9923f9aa-kube-api-access-xzhxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.683250 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.683294 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-config-data\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.683332 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpm5j\" (UniqueName: \"kubernetes.io/projected/c7666b80-e60e-4038-9921-53415db91cef-kube-api-access-wpm5j\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.683376 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.683428 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9c9l\" (UniqueName: \"kubernetes.io/projected/631d054c-8489-442b-915e-edf3f9a3b904-kube-api-access-p9c9l\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.683462 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7666b80-e60e-4038-9921-53415db91cef-logs\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.683485 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.697437 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.745587 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.786055 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.786100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-config-data\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.786125 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpm5j\" (UniqueName: \"kubernetes.io/projected/c7666b80-e60e-4038-9921-53415db91cef-kube-api-access-wpm5j\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.786157 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.786197 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9c9l\" (UniqueName: \"kubernetes.io/projected/631d054c-8489-442b-915e-edf3f9a3b904-kube-api-access-p9c9l\") pod \"nova-scheduler-0\" 
(UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.786223 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7666b80-e60e-4038-9921-53415db91cef-logs\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.786250 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.792223 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.793186 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7666b80-e60e-4038-9921-53415db91cef-logs\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.796524 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-config-data\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.804775 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.815695 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.904746 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpm5j\" (UniqueName: \"kubernetes.io/projected/c7666b80-e60e-4038-9921-53415db91cef-kube-api-access-wpm5j\") pod \"nova-metadata-0\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " pod="openstack/nova-metadata-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.908589 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9c9l\" (UniqueName: \"kubernetes.io/projected/631d054c-8489-442b-915e-edf3f9a3b904-kube-api-access-p9c9l\") pod \"nova-scheduler-0\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.926421 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:14:27 crc kubenswrapper[4789]: I1216 07:14:27.975526 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.089820 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-4g6jl"] Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.093417 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.152209 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-4g6jl"] Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.172456 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4kpsb"] Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.198648 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.198712 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-config\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.198819 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7xv\" (UniqueName: \"kubernetes.io/projected/d2f16298-806a-4dfb-a320-96f52dfeeb6e-kube-api-access-vl7xv\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.198848 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " 
pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.198921 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.198951 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: W1216 07:14:28.206980 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf73fa10b_54a6_4292_be91_84657f4a43cd.slice/crio-7d6e854b84daa13948281bc3da03429976c5354db524a03bf71160cd9d6ba11c WatchSource:0}: Error finding container 7d6e854b84daa13948281bc3da03429976c5354db524a03bf71160cd9d6ba11c: Status 404 returned error can't find the container with id 7d6e854b84daa13948281bc3da03429976c5354db524a03bf71160cd9d6ba11c Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.307369 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.307452 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.307494 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-config\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.308791 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-svc\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.308881 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-swift-storage-0\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.325815 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-config\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.327041 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7xv\" (UniqueName: \"kubernetes.io/projected/d2f16298-806a-4dfb-a320-96f52dfeeb6e-kube-api-access-vl7xv\") pod 
\"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.327081 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.327123 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.327772 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-sb\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.328717 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-nb\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.356607 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7xv\" (UniqueName: \"kubernetes.io/projected/d2f16298-806a-4dfb-a320-96f52dfeeb6e-kube-api-access-vl7xv\") pod \"dnsmasq-dns-647df7b8c5-4g6jl\" (UID: 
\"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.395296 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4kpsb" event={"ID":"f73fa10b-54a6-4292-be91-84657f4a43cd","Type":"ContainerStarted","Data":"7d6e854b84daa13948281bc3da03429976c5354db524a03bf71160cd9d6ba11c"} Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.471876 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.486898 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.610424 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:14:28 crc kubenswrapper[4789]: W1216 07:14:28.620290 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcae1f599_5496_4a3f_8f82_b77d9923f9aa.slice/crio-f2a72b260fb2346bd7692140de84b1b16d5ac168474da22e73e83620f5f00e2c WatchSource:0}: Error finding container f2a72b260fb2346bd7692140de84b1b16d5ac168474da22e73e83620f5f00e2c: Status 404 returned error can't find the container with id f2a72b260fb2346bd7692140de84b1b16d5ac168474da22e73e83620f5f00e2c Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.722873 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:28 crc kubenswrapper[4789]: W1216 07:14:28.734726 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7666b80_e60e_4038_9921_53415db91cef.slice/crio-76f22d7ab0d9196d79388a8444646b1873644d2cacbd06a9c15da48bcc8b1a57 WatchSource:0}: Error finding container 
76f22d7ab0d9196d79388a8444646b1873644d2cacbd06a9c15da48bcc8b1a57: Status 404 returned error can't find the container with id 76f22d7ab0d9196d79388a8444646b1873644d2cacbd06a9c15da48bcc8b1a57 Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.740101 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xtjrv"] Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.741234 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: W1216 07:14:28.748318 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod631d054c_8489_442b_915e_edf3f9a3b904.slice/crio-b14b4ad6893468dbef46c4e27015c7a55e9f11ec880935efb6e1a7d662985be3 WatchSource:0}: Error finding container b14b4ad6893468dbef46c4e27015c7a55e9f11ec880935efb6e1a7d662985be3: Status 404 returned error can't find the container with id b14b4ad6893468dbef46c4e27015c7a55e9f11ec880935efb6e1a7d662985be3 Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.748676 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.748876 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.759303 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.775555 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xtjrv"] Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.841610 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.841835 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-config-data\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.841987 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrv4\" (UniqueName: \"kubernetes.io/projected/e48c05ec-30ab-4ea1-a542-35bf74481375-kube-api-access-qtrv4\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.842033 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-scripts\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.944015 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-scripts\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.944099 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.944203 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-config-data\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.944305 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrv4\" (UniqueName: \"kubernetes.io/projected/e48c05ec-30ab-4ea1-a542-35bf74481375-kube-api-access-qtrv4\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.952996 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.953039 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-config-data\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.957250 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-scripts\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.971624 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrv4\" (UniqueName: \"kubernetes.io/projected/e48c05ec-30ab-4ea1-a542-35bf74481375-kube-api-access-qtrv4\") pod \"nova-cell1-conductor-db-sync-xtjrv\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:28 crc kubenswrapper[4789]: I1216 07:14:28.973950 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-4g6jl"] Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.092072 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.411654 4789 generic.go:334] "Generic (PLEG): container finished" podID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerID="cbfb2e0ba2710519517623fa41df119a7254c846ae45bffd7f3247161de26dbd" exitCode=0 Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.412162 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" event={"ID":"d2f16298-806a-4dfb-a320-96f52dfeeb6e","Type":"ContainerDied","Data":"cbfb2e0ba2710519517623fa41df119a7254c846ae45bffd7f3247161de26dbd"} Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.412197 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" event={"ID":"d2f16298-806a-4dfb-a320-96f52dfeeb6e","Type":"ContainerStarted","Data":"750543b9a7a7a78602f9f5882aacc35ea7e73948b7cc7771b456a60032b284fe"} Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.413818 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"631d054c-8489-442b-915e-edf3f9a3b904","Type":"ContainerStarted","Data":"b14b4ad6893468dbef46c4e27015c7a55e9f11ec880935efb6e1a7d662985be3"} Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.445058 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4kpsb" event={"ID":"f73fa10b-54a6-4292-be91-84657f4a43cd","Type":"ContainerStarted","Data":"5363a9b917a24482d684b0616aeb1c17a20bcb0aac7b8fa2460fcb0158ec6b7d"} Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.447967 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cae1f599-5496-4a3f-8f82-b77d9923f9aa","Type":"ContainerStarted","Data":"f2a72b260fb2346bd7692140de84b1b16d5ac168474da22e73e83620f5f00e2c"} Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.454699 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7666b80-e60e-4038-9921-53415db91cef","Type":"ContainerStarted","Data":"76f22d7ab0d9196d79388a8444646b1873644d2cacbd06a9c15da48bcc8b1a57"} Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.467549 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4kpsb" podStartSLOduration=2.467519121 podStartE2EDuration="2.467519121s" podCreationTimestamp="2025-12-16 07:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:29.464506168 +0000 UTC m=+1407.726393797" watchObservedRunningTime="2025-12-16 07:14:29.467519121 +0000 UTC m=+1407.729406750" Dec 16 07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.468475 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dac0e455-b023-44ce-8ba6-9fd520f1e0fb","Type":"ContainerStarted","Data":"83ca8a2ebf72c5e7ab3072f13079e0dbc0ff260c7050716a11f7876caf129cb6"} Dec 16 
07:14:29 crc kubenswrapper[4789]: I1216 07:14:29.614870 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xtjrv"] Dec 16 07:14:29 crc kubenswrapper[4789]: W1216 07:14:29.630072 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode48c05ec_30ab_4ea1_a542_35bf74481375.slice/crio-159c09e9fa11825b995a5ef021c8f71004f34b622507e890fb0f7ddb3785458e WatchSource:0}: Error finding container 159c09e9fa11825b995a5ef021c8f71004f34b622507e890fb0f7ddb3785458e: Status 404 returned error can't find the container with id 159c09e9fa11825b995a5ef021c8f71004f34b622507e890fb0f7ddb3785458e Dec 16 07:14:30 crc kubenswrapper[4789]: I1216 07:14:30.481141 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" event={"ID":"d2f16298-806a-4dfb-a320-96f52dfeeb6e","Type":"ContainerStarted","Data":"a9d3e2d0717a7e9587d61cdaf207fe23e29b7bf0ae3ee2517d87efb7bc8817af"} Dec 16 07:14:30 crc kubenswrapper[4789]: I1216 07:14:30.481436 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:30 crc kubenswrapper[4789]: I1216 07:14:30.487560 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" event={"ID":"e48c05ec-30ab-4ea1-a542-35bf74481375","Type":"ContainerStarted","Data":"ea367b9824a152f5e9f0d2ff9d6db99fdb1d7a70ba0b2b4e1be6b907ac9c6eb4"} Dec 16 07:14:30 crc kubenswrapper[4789]: I1216 07:14:30.487610 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" event={"ID":"e48c05ec-30ab-4ea1-a542-35bf74481375","Type":"ContainerStarted","Data":"159c09e9fa11825b995a5ef021c8f71004f34b622507e890fb0f7ddb3785458e"} Dec 16 07:14:30 crc kubenswrapper[4789]: I1216 07:14:30.505673 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" podStartSLOduration=3.505657377 podStartE2EDuration="3.505657377s" podCreationTimestamp="2025-12-16 07:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:30.502931361 +0000 UTC m=+1408.764818990" watchObservedRunningTime="2025-12-16 07:14:30.505657377 +0000 UTC m=+1408.767545006" Dec 16 07:14:30 crc kubenswrapper[4789]: I1216 07:14:30.551165 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" podStartSLOduration=2.551140122 podStartE2EDuration="2.551140122s" podCreationTimestamp="2025-12-16 07:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:30.523242745 +0000 UTC m=+1408.785130374" watchObservedRunningTime="2025-12-16 07:14:30.551140122 +0000 UTC m=+1408.813027751" Dec 16 07:14:31 crc kubenswrapper[4789]: I1216 07:14:31.652208 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:31 crc kubenswrapper[4789]: I1216 07:14:31.665441 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.523782 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-log" containerID="cri-o://88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3" gracePeriod=30 Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.524693 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7666b80-e60e-4038-9921-53415db91cef","Type":"ContainerStarted","Data":"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2"} Dec 16 07:14:34 crc 
kubenswrapper[4789]: I1216 07:14:34.524726 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7666b80-e60e-4038-9921-53415db91cef","Type":"ContainerStarted","Data":"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3"} Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.525088 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-metadata" containerID="cri-o://a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2" gracePeriod=30 Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.540048 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dac0e455-b023-44ce-8ba6-9fd520f1e0fb","Type":"ContainerStarted","Data":"18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a"} Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.540096 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dac0e455-b023-44ce-8ba6-9fd520f1e0fb","Type":"ContainerStarted","Data":"f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6"} Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.556479 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"631d054c-8489-442b-915e-edf3f9a3b904","Type":"ContainerStarted","Data":"0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10"} Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.576060 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.626769479 podStartE2EDuration="7.576029247s" podCreationTimestamp="2025-12-16 07:14:27 +0000 UTC" firstStartedPulling="2025-12-16 07:14:28.738532064 +0000 UTC m=+1407.000419703" lastFinishedPulling="2025-12-16 07:14:33.687791842 +0000 UTC m=+1411.949679471" 
observedRunningTime="2025-12-16 07:14:34.546529871 +0000 UTC m=+1412.808417500" watchObservedRunningTime="2025-12-16 07:14:34.576029247 +0000 UTC m=+1412.837916876" Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.577955 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cae1f599-5496-4a3f-8f82-b77d9923f9aa","Type":"ContainerStarted","Data":"03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a"} Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.578120 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cae1f599-5496-4a3f-8f82-b77d9923f9aa" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a" gracePeriod=30 Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.586519 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.391646819 podStartE2EDuration="7.586493722s" podCreationTimestamp="2025-12-16 07:14:27 +0000 UTC" firstStartedPulling="2025-12-16 07:14:28.493109423 +0000 UTC m=+1406.754997052" lastFinishedPulling="2025-12-16 07:14:33.687956326 +0000 UTC m=+1411.949843955" observedRunningTime="2025-12-16 07:14:34.565288597 +0000 UTC m=+1412.827176226" watchObservedRunningTime="2025-12-16 07:14:34.586493722 +0000 UTC m=+1412.848381351" Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.607349 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.674887948 podStartE2EDuration="7.607328268s" podCreationTimestamp="2025-12-16 07:14:27 +0000 UTC" firstStartedPulling="2025-12-16 07:14:28.755188568 +0000 UTC m=+1407.017076187" lastFinishedPulling="2025-12-16 07:14:33.687628868 +0000 UTC m=+1411.949516507" observedRunningTime="2025-12-16 07:14:34.581023749 +0000 UTC m=+1412.842911378" 
watchObservedRunningTime="2025-12-16 07:14:34.607328268 +0000 UTC m=+1412.869215897" Dec 16 07:14:34 crc kubenswrapper[4789]: I1216 07:14:34.610845 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5489431380000003 podStartE2EDuration="7.610827962s" podCreationTimestamp="2025-12-16 07:14:27 +0000 UTC" firstStartedPulling="2025-12-16 07:14:28.62683769 +0000 UTC m=+1406.888725319" lastFinishedPulling="2025-12-16 07:14:33.688722514 +0000 UTC m=+1411.950610143" observedRunningTime="2025-12-16 07:14:34.608093086 +0000 UTC m=+1412.869980715" watchObservedRunningTime="2025-12-16 07:14:34.610827962 +0000 UTC m=+1412.872715591" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.526360 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.618543 4789 generic.go:334] "Generic (PLEG): container finished" podID="c7666b80-e60e-4038-9921-53415db91cef" containerID="a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2" exitCode=0 Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.618575 4789 generic.go:334] "Generic (PLEG): container finished" podID="c7666b80-e60e-4038-9921-53415db91cef" containerID="88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3" exitCode=143 Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.618586 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7666b80-e60e-4038-9921-53415db91cef","Type":"ContainerDied","Data":"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2"} Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.618632 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c7666b80-e60e-4038-9921-53415db91cef","Type":"ContainerDied","Data":"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3"} Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.618641 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.618666 4789 scope.go:117] "RemoveContainer" containerID="a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.618651 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7666b80-e60e-4038-9921-53415db91cef","Type":"ContainerDied","Data":"76f22d7ab0d9196d79388a8444646b1873644d2cacbd06a9c15da48bcc8b1a57"} Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.645552 4789 scope.go:117] "RemoveContainer" containerID="88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.666755 4789 scope.go:117] "RemoveContainer" containerID="a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2" Dec 16 07:14:35 crc kubenswrapper[4789]: E1216 07:14:35.667281 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2\": container with ID starting with a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2 not found: ID does not exist" containerID="a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.667338 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2"} err="failed to get container status \"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2\": rpc error: code = NotFound desc = 
could not find container \"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2\": container with ID starting with a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2 not found: ID does not exist" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.667367 4789 scope.go:117] "RemoveContainer" containerID="88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3" Dec 16 07:14:35 crc kubenswrapper[4789]: E1216 07:14:35.668370 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3\": container with ID starting with 88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3 not found: ID does not exist" containerID="88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.668404 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3"} err="failed to get container status \"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3\": rpc error: code = NotFound desc = could not find container \"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3\": container with ID starting with 88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3 not found: ID does not exist" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.668430 4789 scope.go:117] "RemoveContainer" containerID="a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.668774 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2"} err="failed to get container status \"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2\": rpc error: code = 
NotFound desc = could not find container \"a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2\": container with ID starting with a9b8dbc2b5c4d23108c435c858469d0801506471630a84096f594cd56aae97b2 not found: ID does not exist" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.668799 4789 scope.go:117] "RemoveContainer" containerID="88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.669189 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3"} err="failed to get container status \"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3\": rpc error: code = NotFound desc = could not find container \"88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3\": container with ID starting with 88579b0312678032a55ee145a85fa92b01a64eaf62c792a03816cf08215520f3 not found: ID does not exist" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.707810 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpm5j\" (UniqueName: \"kubernetes.io/projected/c7666b80-e60e-4038-9921-53415db91cef-kube-api-access-wpm5j\") pod \"c7666b80-e60e-4038-9921-53415db91cef\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.708015 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7666b80-e60e-4038-9921-53415db91cef-logs\") pod \"c7666b80-e60e-4038-9921-53415db91cef\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.708078 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-combined-ca-bundle\") pod 
\"c7666b80-e60e-4038-9921-53415db91cef\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.708166 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data\") pod \"c7666b80-e60e-4038-9921-53415db91cef\" (UID: \"c7666b80-e60e-4038-9921-53415db91cef\") " Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.709671 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7666b80-e60e-4038-9921-53415db91cef-logs" (OuterVolumeSpecName: "logs") pod "c7666b80-e60e-4038-9921-53415db91cef" (UID: "c7666b80-e60e-4038-9921-53415db91cef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.717087 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7666b80-e60e-4038-9921-53415db91cef-kube-api-access-wpm5j" (OuterVolumeSpecName: "kube-api-access-wpm5j") pod "c7666b80-e60e-4038-9921-53415db91cef" (UID: "c7666b80-e60e-4038-9921-53415db91cef"). InnerVolumeSpecName "kube-api-access-wpm5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:35 crc kubenswrapper[4789]: E1216 07:14:35.742370 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data podName:c7666b80-e60e-4038-9921-53415db91cef nodeName:}" failed. No retries permitted until 2025-12-16 07:14:36.242321836 +0000 UTC m=+1414.504209465 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data") pod "c7666b80-e60e-4038-9921-53415db91cef" (UID: "c7666b80-e60e-4038-9921-53415db91cef") : error deleting /var/lib/kubelet/pods/c7666b80-e60e-4038-9921-53415db91cef/volume-subpaths: remove /var/lib/kubelet/pods/c7666b80-e60e-4038-9921-53415db91cef/volume-subpaths: no such file or directory Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.745759 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7666b80-e60e-4038-9921-53415db91cef" (UID: "c7666b80-e60e-4038-9921-53415db91cef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.810598 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7666b80-e60e-4038-9921-53415db91cef-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.810637 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:35 crc kubenswrapper[4789]: I1216 07:14:35.810651 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpm5j\" (UniqueName: \"kubernetes.io/projected/c7666b80-e60e-4038-9921-53415db91cef-kube-api-access-wpm5j\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.320063 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data\") pod \"c7666b80-e60e-4038-9921-53415db91cef\" (UID: 
\"c7666b80-e60e-4038-9921-53415db91cef\") " Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.323965 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data" (OuterVolumeSpecName: "config-data") pod "c7666b80-e60e-4038-9921-53415db91cef" (UID: "c7666b80-e60e-4038-9921-53415db91cef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.422737 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7666b80-e60e-4038-9921-53415db91cef-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.551107 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.561316 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.579611 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:36 crc kubenswrapper[4789]: E1216 07:14:36.580002 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-metadata" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.580024 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-metadata" Dec 16 07:14:36 crc kubenswrapper[4789]: E1216 07:14:36.580059 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-log" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.580064 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-log" Dec 16 
07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.580223 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-log" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.580242 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7666b80-e60e-4038-9921-53415db91cef" containerName="nova-metadata-metadata" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.581416 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.585162 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.592122 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.593568 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.695124 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.731552 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.731643 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae3dfe-6d94-433a-9843-81f8a00ffaac-logs\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " 
pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.731728 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-config-data\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.732099 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fblpx\" (UniqueName: \"kubernetes.io/projected/14ae3dfe-6d94-433a-9843-81f8a00ffaac-kube-api-access-fblpx\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.732263 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.833839 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fblpx\" (UniqueName: \"kubernetes.io/projected/14ae3dfe-6d94-433a-9843-81f8a00ffaac-kube-api-access-fblpx\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.833927 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.833965 
4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.834006 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae3dfe-6d94-433a-9843-81f8a00ffaac-logs\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.834093 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-config-data\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.834642 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae3dfe-6d94-433a-9843-81f8a00ffaac-logs\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.840469 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.840627 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-config-data\") pod \"nova-metadata-0\" (UID: 
\"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.840879 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.853404 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fblpx\" (UniqueName: \"kubernetes.io/projected/14ae3dfe-6d94-433a-9843-81f8a00ffaac-kube-api-access-fblpx\") pod \"nova-metadata-0\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " pod="openstack/nova-metadata-0" Dec 16 07:14:36 crc kubenswrapper[4789]: I1216 07:14:36.898090 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:37 crc kubenswrapper[4789]: I1216 07:14:37.382691 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:37 crc kubenswrapper[4789]: W1216 07:14:37.385519 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ae3dfe_6d94_433a_9843_81f8a00ffaac.slice/crio-d548a48130ca878afb55284b7346f99fbe9b08a1ce821fefd76ae1ce23d53597 WatchSource:0}: Error finding container d548a48130ca878afb55284b7346f99fbe9b08a1ce821fefd76ae1ce23d53597: Status 404 returned error can't find the container with id d548a48130ca878afb55284b7346f99fbe9b08a1ce821fefd76ae1ce23d53597 Dec 16 07:14:37 crc kubenswrapper[4789]: I1216 07:14:37.642559 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14ae3dfe-6d94-433a-9843-81f8a00ffaac","Type":"ContainerStarted","Data":"d548a48130ca878afb55284b7346f99fbe9b08a1ce821fefd76ae1ce23d53597"} Dec 16 07:14:37 crc 
kubenswrapper[4789]: I1216 07:14:37.698776 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:14:37 crc kubenswrapper[4789]: I1216 07:14:37.698851 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:14:37 crc kubenswrapper[4789]: I1216 07:14:37.746420 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:14:37 crc kubenswrapper[4789]: I1216 07:14:37.927491 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 07:14:37 crc kubenswrapper[4789]: I1216 07:14:37.927876 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 07:14:37 crc kubenswrapper[4789]: I1216 07:14:37.968506 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.116469 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7666b80-e60e-4038-9921-53415db91cef" path="/var/lib/kubelet/pods/c7666b80-e60e-4038-9921-53415db91cef/volumes" Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.474731 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.541920 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-296d6"] Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.542876 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" podUID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerName="dnsmasq-dns" containerID="cri-o://f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac" gracePeriod=10 Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 
07:14:38.666786 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14ae3dfe-6d94-433a-9843-81f8a00ffaac","Type":"ContainerStarted","Data":"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096"} Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.666834 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14ae3dfe-6d94-433a-9843-81f8a00ffaac","Type":"ContainerStarted","Data":"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"} Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.671661 4789 generic.go:334] "Generic (PLEG): container finished" podID="f73fa10b-54a6-4292-be91-84657f4a43cd" containerID="5363a9b917a24482d684b0616aeb1c17a20bcb0aac7b8fa2460fcb0158ec6b7d" exitCode=0 Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.671755 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4kpsb" event={"ID":"f73fa10b-54a6-4292-be91-84657f4a43cd","Type":"ContainerDied","Data":"5363a9b917a24482d684b0616aeb1c17a20bcb0aac7b8fa2460fcb0158ec6b7d"} Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.720991 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.722166 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.722146076 podStartE2EDuration="2.722146076s" podCreationTimestamp="2025-12-16 07:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:38.702384626 +0000 UTC m=+1416.964272265" watchObservedRunningTime="2025-12-16 07:14:38.722146076 +0000 UTC m=+1416.984033695" Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.783304 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:14:38 crc kubenswrapper[4789]: I1216 07:14:38.784052 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.243305 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.394438 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-svc\") pod \"b5c97ba8-23ab-45c0-82fa-4260a301b089\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.394692 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-swift-storage-0\") pod \"b5c97ba8-23ab-45c0-82fa-4260a301b089\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.394774 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-config\") pod \"b5c97ba8-23ab-45c0-82fa-4260a301b089\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.394827 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhvk9\" (UniqueName: 
\"kubernetes.io/projected/b5c97ba8-23ab-45c0-82fa-4260a301b089-kube-api-access-qhvk9\") pod \"b5c97ba8-23ab-45c0-82fa-4260a301b089\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.394878 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-sb\") pod \"b5c97ba8-23ab-45c0-82fa-4260a301b089\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.394939 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-nb\") pod \"b5c97ba8-23ab-45c0-82fa-4260a301b089\" (UID: \"b5c97ba8-23ab-45c0-82fa-4260a301b089\") " Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.436142 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c97ba8-23ab-45c0-82fa-4260a301b089-kube-api-access-qhvk9" (OuterVolumeSpecName: "kube-api-access-qhvk9") pod "b5c97ba8-23ab-45c0-82fa-4260a301b089" (UID: "b5c97ba8-23ab-45c0-82fa-4260a301b089"). InnerVolumeSpecName "kube-api-access-qhvk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.497300 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhvk9\" (UniqueName: \"kubernetes.io/projected/b5c97ba8-23ab-45c0-82fa-4260a301b089-kube-api-access-qhvk9\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.516823 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5c97ba8-23ab-45c0-82fa-4260a301b089" (UID: "b5c97ba8-23ab-45c0-82fa-4260a301b089"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.540389 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5c97ba8-23ab-45c0-82fa-4260a301b089" (UID: "b5c97ba8-23ab-45c0-82fa-4260a301b089"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.559400 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-config" (OuterVolumeSpecName: "config") pod "b5c97ba8-23ab-45c0-82fa-4260a301b089" (UID: "b5c97ba8-23ab-45c0-82fa-4260a301b089"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.560433 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5c97ba8-23ab-45c0-82fa-4260a301b089" (UID: "b5c97ba8-23ab-45c0-82fa-4260a301b089"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.593129 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5c97ba8-23ab-45c0-82fa-4260a301b089" (UID: "b5c97ba8-23ab-45c0-82fa-4260a301b089"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.598981 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.599019 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.599034 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.599047 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.599060 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c97ba8-23ab-45c0-82fa-4260a301b089-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.682163 4789 generic.go:334] "Generic (PLEG): container finished" podID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerID="f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac" exitCode=0 Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.683259 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.686297 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" event={"ID":"b5c97ba8-23ab-45c0-82fa-4260a301b089","Type":"ContainerDied","Data":"f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac"} Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.686364 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dbb546bf-296d6" event={"ID":"b5c97ba8-23ab-45c0-82fa-4260a301b089","Type":"ContainerDied","Data":"ec192f7b6089cf9d77b9c1a03241fe97cf527c845504ce7c5564b0b43bc5048f"} Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.686391 4789 scope.go:117] "RemoveContainer" containerID="f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.719972 4789 scope.go:117] "RemoveContainer" containerID="d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.731965 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-296d6"] Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.741630 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75dbb546bf-296d6"] Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.750364 4789 scope.go:117] "RemoveContainer" containerID="f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac" Dec 16 07:14:39 crc kubenswrapper[4789]: E1216 07:14:39.751429 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac\": container with ID starting with f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac not found: ID does not exist" 
containerID="f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.751462 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac"} err="failed to get container status \"f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac\": rpc error: code = NotFound desc = could not find container \"f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac\": container with ID starting with f7b58c5cd57163eb9fed90e1be65a2367e5130ee35cb665224d200dc71430aac not found: ID does not exist" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.751484 4789 scope.go:117] "RemoveContainer" containerID="d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322" Dec 16 07:14:39 crc kubenswrapper[4789]: E1216 07:14:39.751750 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322\": container with ID starting with d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322 not found: ID does not exist" containerID="d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322" Dec 16 07:14:39 crc kubenswrapper[4789]: I1216 07:14:39.751773 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322"} err="failed to get container status \"d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322\": rpc error: code = NotFound desc = could not find container \"d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322\": container with ID starting with d85c6453dde04a6779a75632c46647ebc6bcebeadf2600b25c4691f474adf322 not found: ID does not exist" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.029868 4789 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4kpsb" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.108888 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-config-data\") pod \"f73fa10b-54a6-4292-be91-84657f4a43cd\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.109000 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzhzf\" (UniqueName: \"kubernetes.io/projected/f73fa10b-54a6-4292-be91-84657f4a43cd-kube-api-access-dzhzf\") pod \"f73fa10b-54a6-4292-be91-84657f4a43cd\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.109180 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-scripts\") pod \"f73fa10b-54a6-4292-be91-84657f4a43cd\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.109393 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-combined-ca-bundle\") pod \"f73fa10b-54a6-4292-be91-84657f4a43cd\" (UID: \"f73fa10b-54a6-4292-be91-84657f4a43cd\") " Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.121101 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-scripts" (OuterVolumeSpecName: "scripts") pod "f73fa10b-54a6-4292-be91-84657f4a43cd" (UID: "f73fa10b-54a6-4292-be91-84657f4a43cd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.121140 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f73fa10b-54a6-4292-be91-84657f4a43cd-kube-api-access-dzhzf" (OuterVolumeSpecName: "kube-api-access-dzhzf") pod "f73fa10b-54a6-4292-be91-84657f4a43cd" (UID: "f73fa10b-54a6-4292-be91-84657f4a43cd"). InnerVolumeSpecName "kube-api-access-dzhzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.128010 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c97ba8-23ab-45c0-82fa-4260a301b089" path="/var/lib/kubelet/pods/b5c97ba8-23ab-45c0-82fa-4260a301b089/volumes" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.151504 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f73fa10b-54a6-4292-be91-84657f4a43cd" (UID: "f73fa10b-54a6-4292-be91-84657f4a43cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.156936 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-config-data" (OuterVolumeSpecName: "config-data") pod "f73fa10b-54a6-4292-be91-84657f4a43cd" (UID: "f73fa10b-54a6-4292-be91-84657f4a43cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.212046 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.212088 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.212103 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f73fa10b-54a6-4292-be91-84657f4a43cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.212115 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzhzf\" (UniqueName: \"kubernetes.io/projected/f73fa10b-54a6-4292-be91-84657f4a43cd-kube-api-access-dzhzf\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.692755 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4kpsb" event={"ID":"f73fa10b-54a6-4292-be91-84657f4a43cd","Type":"ContainerDied","Data":"7d6e854b84daa13948281bc3da03429976c5354db524a03bf71160cd9d6ba11c"} Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.692836 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6e854b84daa13948281bc3da03429976c5354db524a03bf71160cd9d6ba11c" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.693256 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4kpsb" Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.871190 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.871718 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-log" containerID="cri-o://f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6" gracePeriod=30 Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.872254 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-api" containerID="cri-o://18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a" gracePeriod=30 Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.900512 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.900872 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="631d054c-8489-442b-915e-edf3f9a3b904" containerName="nova-scheduler-scheduler" containerID="cri-o://0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10" gracePeriod=30 Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.924066 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.924701 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-metadata" containerID="cri-o://0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096" gracePeriod=30 Dec 16 07:14:40 crc kubenswrapper[4789]: I1216 07:14:40.924991 4789 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-log" containerID="cri-o://83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20" gracePeriod=30 Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.357504 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.357949 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="02b0fdb4-d395-4464-8250-4288ca50c8de" containerName="kube-state-metrics" containerID="cri-o://c2cd1af7c1523ef803d9a8d625c1a5e7c662281e5cab347517b54666ab0a97e0" gracePeriod=30 Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.578179 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.638429 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-combined-ca-bundle\") pod \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.638790 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fblpx\" (UniqueName: \"kubernetes.io/projected/14ae3dfe-6d94-433a-9843-81f8a00ffaac-kube-api-access-fblpx\") pod \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.638832 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-config-data\") pod \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\" 
(UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.638934 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae3dfe-6d94-433a-9843-81f8a00ffaac-logs\") pod \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.638997 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-nova-metadata-tls-certs\") pod \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\" (UID: \"14ae3dfe-6d94-433a-9843-81f8a00ffaac\") " Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.639575 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ae3dfe-6d94-433a-9843-81f8a00ffaac-logs" (OuterVolumeSpecName: "logs") pod "14ae3dfe-6d94-433a-9843-81f8a00ffaac" (UID: "14ae3dfe-6d94-433a-9843-81f8a00ffaac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.644624 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ae3dfe-6d94-433a-9843-81f8a00ffaac-kube-api-access-fblpx" (OuterVolumeSpecName: "kube-api-access-fblpx") pod "14ae3dfe-6d94-433a-9843-81f8a00ffaac" (UID: "14ae3dfe-6d94-433a-9843-81f8a00ffaac"). InnerVolumeSpecName "kube-api-access-fblpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.687176 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-config-data" (OuterVolumeSpecName: "config-data") pod "14ae3dfe-6d94-433a-9843-81f8a00ffaac" (UID: "14ae3dfe-6d94-433a-9843-81f8a00ffaac"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.716974 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14ae3dfe-6d94-433a-9843-81f8a00ffaac" (UID: "14ae3dfe-6d94-433a-9843-81f8a00ffaac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.733211 4789 generic.go:334] "Generic (PLEG): container finished" podID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerID="f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6" exitCode=143 Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.733331 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dac0e455-b023-44ce-8ba6-9fd520f1e0fb","Type":"ContainerDied","Data":"f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6"} Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.736282 4789 generic.go:334] "Generic (PLEG): container finished" podID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerID="0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096" exitCode=0 Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.736306 4789 generic.go:334] "Generic (PLEG): container finished" podID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerID="83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20" exitCode=143 Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.736390 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.737143 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14ae3dfe-6d94-433a-9843-81f8a00ffaac","Type":"ContainerDied","Data":"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096"} Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.737169 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14ae3dfe-6d94-433a-9843-81f8a00ffaac","Type":"ContainerDied","Data":"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"} Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.737180 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14ae3dfe-6d94-433a-9843-81f8a00ffaac","Type":"ContainerDied","Data":"d548a48130ca878afb55284b7346f99fbe9b08a1ce821fefd76ae1ce23d53597"} Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.737196 4789 scope.go:117] "RemoveContainer" containerID="0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.744642 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.744715 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae3dfe-6d94-433a-9843-81f8a00ffaac-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.744726 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.744739 4789 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fblpx\" (UniqueName: \"kubernetes.io/projected/14ae3dfe-6d94-433a-9843-81f8a00ffaac-kube-api-access-fblpx\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.750116 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "14ae3dfe-6d94-433a-9843-81f8a00ffaac" (UID: "14ae3dfe-6d94-433a-9843-81f8a00ffaac"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.750310 4789 generic.go:334] "Generic (PLEG): container finished" podID="02b0fdb4-d395-4464-8250-4288ca50c8de" containerID="c2cd1af7c1523ef803d9a8d625c1a5e7c662281e5cab347517b54666ab0a97e0" exitCode=2
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.750354 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"02b0fdb4-d395-4464-8250-4288ca50c8de","Type":"ContainerDied","Data":"c2cd1af7c1523ef803d9a8d625c1a5e7c662281e5cab347517b54666ab0a97e0"}
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.800149 4789 scope.go:117] "RemoveContainer" containerID="83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.823669 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.824668 4789 scope.go:117] "RemoveContainer" containerID="0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096"
Dec 16 07:14:41 crc kubenswrapper[4789]: E1216 07:14:41.824877 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096\": container with ID starting with 0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096 not found: ID does not exist" containerID="0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.824971 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096"} err="failed to get container status \"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096\": rpc error: code = NotFound desc = could not find container \"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096\": container with ID starting with 0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096 not found: ID does not exist"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.824991 4789 scope.go:117] "RemoveContainer" containerID="83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"
Dec 16 07:14:41 crc kubenswrapper[4789]: E1216 07:14:41.825957 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20\": container with ID starting with 83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20 not found: ID does not exist" containerID="83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.826019 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"} err="failed to get container status \"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20\": rpc error: code = NotFound desc = could not find container \"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20\": container with ID starting with 83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20 not found: ID does not exist"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.826046 4789 scope.go:117] "RemoveContainer" containerID="0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.826408 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096"} err="failed to get container status \"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096\": rpc error: code = NotFound desc = could not find container \"0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096\": container with ID starting with 0de8fd4ad214b1bf1187ec29eb4257725696cbbe47c9aa9023a61bf3657dd096 not found: ID does not exist"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.826437 4789 scope.go:117] "RemoveContainer" containerID="83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.826718 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20"} err="failed to get container status \"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20\": rpc error: code = NotFound desc = could not find container \"83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20\": container with ID starting with 83b553a5ba7ae127bea264943ff35aebaf066d9c46547113eabadd4c42fbfd20 not found: ID does not exist"
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.846077 4789 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae3dfe-6d94-433a-9843-81f8a00ffaac-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.947077 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxxmz\" (UniqueName: \"kubernetes.io/projected/02b0fdb4-d395-4464-8250-4288ca50c8de-kube-api-access-mxxmz\") pod \"02b0fdb4-d395-4464-8250-4288ca50c8de\" (UID: \"02b0fdb4-d395-4464-8250-4288ca50c8de\") "
Dec 16 07:14:41 crc kubenswrapper[4789]: I1216 07:14:41.950145 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b0fdb4-d395-4464-8250-4288ca50c8de-kube-api-access-mxxmz" (OuterVolumeSpecName: "kube-api-access-mxxmz") pod "02b0fdb4-d395-4464-8250-4288ca50c8de" (UID: "02b0fdb4-d395-4464-8250-4288ca50c8de"). InnerVolumeSpecName "kube-api-access-mxxmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.050464 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxxmz\" (UniqueName: \"kubernetes.io/projected/02b0fdb4-d395-4464-8250-4288ca50c8de-kube-api-access-mxxmz\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.078555 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.099057 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.132321 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" path="/var/lib/kubelet/pods/14ae3dfe-6d94-433a-9843-81f8a00ffaac/volumes"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133177 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.133504 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b0fdb4-d395-4464-8250-4288ca50c8de" containerName="kube-state-metrics"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133523 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b0fdb4-d395-4464-8250-4288ca50c8de" containerName="kube-state-metrics"
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.133538 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerName="init"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133548 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerName="init"
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.133559 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-log"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133565 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-log"
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.133597 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f73fa10b-54a6-4292-be91-84657f4a43cd" containerName="nova-manage"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133603 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f73fa10b-54a6-4292-be91-84657f4a43cd" containerName="nova-manage"
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.133620 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-metadata"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133627 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-metadata"
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.133639 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerName="dnsmasq-dns"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133646 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerName="dnsmasq-dns"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133808 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-log"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133822 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b0fdb4-d395-4464-8250-4288ca50c8de" containerName="kube-state-metrics"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133838 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ae3dfe-6d94-433a-9843-81f8a00ffaac" containerName="nova-metadata-metadata"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133850 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f73fa10b-54a6-4292-be91-84657f4a43cd" containerName="nova-manage"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.133862 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c97ba8-23ab-45c0-82fa-4260a301b089" containerName="dnsmasq-dns"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.135174 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.140521 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.141344 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.141400 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.254167 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01cde849-67ae-4c7f-b288-04aa02b02fc9-logs\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.254654 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.254695 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-config-data\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.254797 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwlx\" (UniqueName: \"kubernetes.io/projected/01cde849-67ae-4c7f-b288-04aa02b02fc9-kube-api-access-4zwlx\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.254823 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.356625 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwlx\" (UniqueName: \"kubernetes.io/projected/01cde849-67ae-4c7f-b288-04aa02b02fc9-kube-api-access-4zwlx\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.356986 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.357114 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01cde849-67ae-4c7f-b288-04aa02b02fc9-logs\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.357254 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.357354 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-config-data\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.357538 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01cde849-67ae-4c7f-b288-04aa02b02fc9-logs\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.360739 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.361243 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-config-data\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.362290 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.373407 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwlx\" (UniqueName: \"kubernetes.io/projected/01cde849-67ae-4c7f-b288-04aa02b02fc9-kube-api-access-4zwlx\") pod \"nova-metadata-0\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.531889 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.762157 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"02b0fdb4-d395-4464-8250-4288ca50c8de","Type":"ContainerDied","Data":"f63dd86fd6f300df198f61c2064f8ec952090ae09cee0dcd1fdf84aa6369e100"}
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.762437 4789 scope.go:117] "RemoveContainer" containerID="c2cd1af7c1523ef803d9a8d625c1a5e7c662281e5cab347517b54666ab0a97e0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.762174 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.764982 4789 generic.go:334] "Generic (PLEG): container finished" podID="e48c05ec-30ab-4ea1-a542-35bf74481375" containerID="ea367b9824a152f5e9f0d2ff9d6db99fdb1d7a70ba0b2b4e1be6b907ac9c6eb4" exitCode=0
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.765021 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" event={"ID":"e48c05ec-30ab-4ea1-a542-35bf74481375","Type":"ContainerDied","Data":"ea367b9824a152f5e9f0d2ff9d6db99fdb1d7a70ba0b2b4e1be6b907ac9c6eb4"}
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.873593 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.888450 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.900043 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.901338 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.903806 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.904049 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.908265 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.932441 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10 is running failed: container process not found" containerID="0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.935187 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10 is running failed: container process not found" containerID="0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.935430 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10 is running failed: container process not found" containerID="0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 16 07:14:42 crc kubenswrapper[4789]: E1216 07:14:42.935488 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="631d054c-8489-442b-915e-edf3f9a3b904" containerName="nova-scheduler-scheduler"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.970436 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.970505 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.970799 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.977473 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwllj\" (UniqueName: \"kubernetes.io/projected/29706741-1258-454c-968f-836e472cb685-kube-api-access-pwllj\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:42 crc kubenswrapper[4789]: I1216 07:14:42.982765 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 07:14:42 crc kubenswrapper[4789]: W1216 07:14:42.985132 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01cde849_67ae_4c7f_b288_04aa02b02fc9.slice/crio-52caaac66d6d6a042fb84ff33b35aa02755add4845f493d10ec9cf3f4873f637 WatchSource:0}: Error finding container 52caaac66d6d6a042fb84ff33b35aa02755add4845f493d10ec9cf3f4873f637: Status 404 returned error can't find the container with id 52caaac66d6d6a042fb84ff33b35aa02755add4845f493d10ec9cf3f4873f637
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.079484 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.079838 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.079929 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.080013 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwllj\" (UniqueName: \"kubernetes.io/projected/29706741-1258-454c-968f-836e472cb685-kube-api-access-pwllj\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.086249 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.086480 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.088401 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.096982 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwllj\" (UniqueName: \"kubernetes.io/projected/29706741-1258-454c-968f-836e472cb685-kube-api-access-pwllj\") pod \"kube-state-metrics-0\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.152517 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.182368 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-combined-ca-bundle\") pod \"631d054c-8489-442b-915e-edf3f9a3b904\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") "
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.182499 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9c9l\" (UniqueName: \"kubernetes.io/projected/631d054c-8489-442b-915e-edf3f9a3b904-kube-api-access-p9c9l\") pod \"631d054c-8489-442b-915e-edf3f9a3b904\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") "
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.182607 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-config-data\") pod \"631d054c-8489-442b-915e-edf3f9a3b904\" (UID: \"631d054c-8489-442b-915e-edf3f9a3b904\") "
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.186705 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631d054c-8489-442b-915e-edf3f9a3b904-kube-api-access-p9c9l" (OuterVolumeSpecName: "kube-api-access-p9c9l") pod "631d054c-8489-442b-915e-edf3f9a3b904" (UID: "631d054c-8489-442b-915e-edf3f9a3b904"). InnerVolumeSpecName "kube-api-access-p9c9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.219239 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "631d054c-8489-442b-915e-edf3f9a3b904" (UID: "631d054c-8489-442b-915e-edf3f9a3b904"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.223549 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.233569 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-config-data" (OuterVolumeSpecName: "config-data") pod "631d054c-8489-442b-915e-edf3f9a3b904" (UID: "631d054c-8489-442b-915e-edf3f9a3b904"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.285477 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9c9l\" (UniqueName: \"kubernetes.io/projected/631d054c-8489-442b-915e-edf3f9a3b904-kube-api-access-p9c9l\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.285515 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.285526 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/631d054c-8489-442b-915e-edf3f9a3b904-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.375771 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.376051 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="ceilometer-central-agent" containerID="cri-o://b79cb583fbbb85620150b995e02876362ccfffacd2f652db343ed7fdf4bbd7ea" gracePeriod=30
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.376456 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="proxy-httpd" containerID="cri-o://475e135f8166ce02f93701cd667929e5a6db20de00a10020d3974c0a405a443c" gracePeriod=30
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.376501 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="sg-core" containerID="cri-o://6caaba2e9caa91981b19f656daa04c6280bee014350ca8ad8aaae1e9f5256994" gracePeriod=30
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.376537 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="ceilometer-notification-agent" containerID="cri-o://f65d9f5c61e599c970b9cdce1149af994d2d92774c51a54cfeb707abc5e630e6" gracePeriod=30
Dec 16 07:14:43 crc kubenswrapper[4789]: W1216 07:14:43.700486 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29706741_1258_454c_968f_836e472cb685.slice/crio-83d208ca2b5fca84de26a8f9dde0666c1e7b4c4ed9527c11423f74e502920589 WatchSource:0}: Error finding container 83d208ca2b5fca84de26a8f9dde0666c1e7b4c4ed9527c11423f74e502920589: Status 404 returned error can't find the container with id 83d208ca2b5fca84de26a8f9dde0666c1e7b4c4ed9527c11423f74e502920589
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.703577 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.773543 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29706741-1258-454c-968f-836e472cb685","Type":"ContainerStarted","Data":"83d208ca2b5fca84de26a8f9dde0666c1e7b4c4ed9527c11423f74e502920589"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.775771 4789 generic.go:334] "Generic (PLEG): container finished" podID="631d054c-8489-442b-915e-edf3f9a3b904" containerID="0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10" exitCode=0
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.775832 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"631d054c-8489-442b-915e-edf3f9a3b904","Type":"ContainerDied","Data":"0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.775836 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.775867 4789 scope.go:117] "RemoveContainer" containerID="0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.775855 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"631d054c-8489-442b-915e-edf3f9a3b904","Type":"ContainerDied","Data":"b14b4ad6893468dbef46c4e27015c7a55e9f11ec880935efb6e1a7d662985be3"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.780656 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01cde849-67ae-4c7f-b288-04aa02b02fc9","Type":"ContainerStarted","Data":"a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.780987 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01cde849-67ae-4c7f-b288-04aa02b02fc9","Type":"ContainerStarted","Data":"1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.781025 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01cde849-67ae-4c7f-b288-04aa02b02fc9","Type":"ContainerStarted","Data":"52caaac66d6d6a042fb84ff33b35aa02755add4845f493d10ec9cf3f4873f637"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.785659 4789 generic.go:334] "Generic (PLEG): container finished" podID="787226ff-bf02-4615-91b7-e5aa06525027" containerID="475e135f8166ce02f93701cd667929e5a6db20de00a10020d3974c0a405a443c" exitCode=0
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.785686 4789 generic.go:334] "Generic (PLEG): container finished" podID="787226ff-bf02-4615-91b7-e5aa06525027" containerID="6caaba2e9caa91981b19f656daa04c6280bee014350ca8ad8aaae1e9f5256994" exitCode=2
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.785857 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerDied","Data":"475e135f8166ce02f93701cd667929e5a6db20de00a10020d3974c0a405a443c"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.785884 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerDied","Data":"6caaba2e9caa91981b19f656daa04c6280bee014350ca8ad8aaae1e9f5256994"}
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.801976 4789 scope.go:117] "RemoveContainer" containerID="0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10"
Dec 16 07:14:43 crc kubenswrapper[4789]: E1216 07:14:43.805396 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10\": container with ID starting with 0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10 not found: ID does not exist" containerID="0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.805484 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10"} err="failed to get container status \"0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10\": rpc error: code = NotFound desc = could not find container \"0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10\": container with ID starting with 0e6257e22d39939af161462fa1a494dd7177c78817d3f7222204cf23e6dcba10 not found: ID does not exist"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.808892 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.808870723 podStartE2EDuration="1.808870723s" podCreationTimestamp="2025-12-16 07:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:43.802400646 +0000 UTC m=+1422.064288285" watchObservedRunningTime="2025-12-16 07:14:43.808870723 +0000 UTC m=+1422.070758352"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.829009 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.848061 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.860585 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 07:14:43 crc kubenswrapper[4789]: E1216 07:14:43.861045 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631d054c-8489-442b-915e-edf3f9a3b904" containerName="nova-scheduler-scheduler"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.861064 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="631d054c-8489-442b-915e-edf3f9a3b904" containerName="nova-scheduler-scheduler"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.861260 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="631d054c-8489-442b-915e-edf3f9a3b904" containerName="nova-scheduler-scheduler"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.861886 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.864251 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.874413 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.904873 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-config-data\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.905082 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4kdh\" (UniqueName: \"kubernetes.io/projected/00cf1f8f-f55c-445b-9338-b09efa1be344-kube-api-access-z4kdh\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0"
Dec 16 07:14:43 crc kubenswrapper[4789]: I1216 07:14:43.905118 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0"
Dec 16 07:14:44 crc 
kubenswrapper[4789]: I1216 07:14:44.009002 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-config-data\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.009148 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4kdh\" (UniqueName: \"kubernetes.io/projected/00cf1f8f-f55c-445b-9338-b09efa1be344-kube-api-access-z4kdh\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.009188 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.022018 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-config-data\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.023778 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.035139 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4kdh\" (UniqueName: 
\"kubernetes.io/projected/00cf1f8f-f55c-445b-9338-b09efa1be344-kube-api-access-z4kdh\") pod \"nova-scheduler-0\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " pod="openstack/nova-scheduler-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.124464 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b0fdb4-d395-4464-8250-4288ca50c8de" path="/var/lib/kubelet/pods/02b0fdb4-d395-4464-8250-4288ca50c8de/volumes" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.125131 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631d054c-8489-442b-915e-edf3f9a3b904" path="/var/lib/kubelet/pods/631d054c-8489-442b-915e-edf3f9a3b904/volumes" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.129255 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.183122 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.211399 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-config-data\") pod \"e48c05ec-30ab-4ea1-a542-35bf74481375\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.211466 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtrv4\" (UniqueName: \"kubernetes.io/projected/e48c05ec-30ab-4ea1-a542-35bf74481375-kube-api-access-qtrv4\") pod \"e48c05ec-30ab-4ea1-a542-35bf74481375\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.211546 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-combined-ca-bundle\") pod \"e48c05ec-30ab-4ea1-a542-35bf74481375\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.211768 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-scripts\") pod \"e48c05ec-30ab-4ea1-a542-35bf74481375\" (UID: \"e48c05ec-30ab-4ea1-a542-35bf74481375\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.227722 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48c05ec-30ab-4ea1-a542-35bf74481375-kube-api-access-qtrv4" (OuterVolumeSpecName: "kube-api-access-qtrv4") pod "e48c05ec-30ab-4ea1-a542-35bf74481375" (UID: "e48c05ec-30ab-4ea1-a542-35bf74481375"). InnerVolumeSpecName "kube-api-access-qtrv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.234476 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-scripts" (OuterVolumeSpecName: "scripts") pod "e48c05ec-30ab-4ea1-a542-35bf74481375" (UID: "e48c05ec-30ab-4ea1-a542-35bf74481375"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.279858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e48c05ec-30ab-4ea1-a542-35bf74481375" (UID: "e48c05ec-30ab-4ea1-a542-35bf74481375"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.291104 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-config-data" (OuterVolumeSpecName: "config-data") pod "e48c05ec-30ab-4ea1-a542-35bf74481375" (UID: "e48c05ec-30ab-4ea1-a542-35bf74481375"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.313813 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.314191 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.314290 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtrv4\" (UniqueName: \"kubernetes.io/projected/e48c05ec-30ab-4ea1-a542-35bf74481375-kube-api-access-qtrv4\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.314373 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48c05ec-30ab-4ea1-a542-35bf74481375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.618335 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.726725 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-config-data\") pod \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.726878 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle\") pod \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.726970 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-logs\") pod \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.727051 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh6s6\" (UniqueName: \"kubernetes.io/projected/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-kube-api-access-nh6s6\") pod \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.727713 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-logs" (OuterVolumeSpecName: "logs") pod "dac0e455-b023-44ce-8ba6-9fd520f1e0fb" (UID: "dac0e455-b023-44ce-8ba6-9fd520f1e0fb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.727862 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.731551 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-kube-api-access-nh6s6" (OuterVolumeSpecName: "kube-api-access-nh6s6") pod "dac0e455-b023-44ce-8ba6-9fd520f1e0fb" (UID: "dac0e455-b023-44ce-8ba6-9fd520f1e0fb"). InnerVolumeSpecName "kube-api-access-nh6s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:44 crc kubenswrapper[4789]: E1216 07:14:44.757746 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle podName:dac0e455-b023-44ce-8ba6-9fd520f1e0fb nodeName:}" failed. No retries permitted until 2025-12-16 07:14:45.25771814 +0000 UTC m=+1423.519605769 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle") pod "dac0e455-b023-44ce-8ba6-9fd520f1e0fb" (UID: "dac0e455-b023-44ce-8ba6-9fd520f1e0fb") : error deleting /var/lib/kubelet/pods/dac0e455-b023-44ce-8ba6-9fd520f1e0fb/volume-subpaths: remove /var/lib/kubelet/pods/dac0e455-b023-44ce-8ba6-9fd520f1e0fb/volume-subpaths: no such file or directory Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.761304 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-config-data" (OuterVolumeSpecName: "config-data") pod "dac0e455-b023-44ce-8ba6-9fd520f1e0fb" (UID: "dac0e455-b023-44ce-8ba6-9fd520f1e0fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:44 crc kubenswrapper[4789]: W1216 07:14:44.766736 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00cf1f8f_f55c_445b_9338_b09efa1be344.slice/crio-54e678e4bf9b70b66c57acc736c084738a526394b2b941fb10a04b2874f10834 WatchSource:0}: Error finding container 54e678e4bf9b70b66c57acc736c084738a526394b2b941fb10a04b2874f10834: Status 404 returned error can't find the container with id 54e678e4bf9b70b66c57acc736c084738a526394b2b941fb10a04b2874f10834 Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.767042 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.807348 4789 generic.go:334] "Generic (PLEG): container finished" podID="787226ff-bf02-4615-91b7-e5aa06525027" containerID="b79cb583fbbb85620150b995e02876362ccfffacd2f652db343ed7fdf4bbd7ea" exitCode=0 Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.808238 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerDied","Data":"b79cb583fbbb85620150b995e02876362ccfffacd2f652db343ed7fdf4bbd7ea"} Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.820071 4789 generic.go:334] "Generic (PLEG): container finished" podID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerID="18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a" exitCode=0 Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.820145 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dac0e455-b023-44ce-8ba6-9fd520f1e0fb","Type":"ContainerDied","Data":"18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a"} Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.820160 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.820182 4789 scope.go:117] "RemoveContainer" containerID="18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.820171 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dac0e455-b023-44ce-8ba6-9fd520f1e0fb","Type":"ContainerDied","Data":"83ca8a2ebf72c5e7ab3072f13079e0dbc0ff260c7050716a11f7876caf129cb6"} Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.826242 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29706741-1258-454c-968f-836e472cb685","Type":"ContainerStarted","Data":"a0a363829296ba32ca72eb47110c7682a3a5f2c237669efd7a55541ecf92e1ab"} Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.826307 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.828248 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"00cf1f8f-f55c-445b-9338-b09efa1be344","Type":"ContainerStarted","Data":"54e678e4bf9b70b66c57acc736c084738a526394b2b941fb10a04b2874f10834"} Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.829788 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.830130 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh6s6\" (UniqueName: \"kubernetes.io/projected/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-kube-api-access-nh6s6\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.841855 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.845850 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xtjrv" event={"ID":"e48c05ec-30ab-4ea1-a542-35bf74481375","Type":"ContainerDied","Data":"159c09e9fa11825b995a5ef021c8f71004f34b622507e890fb0f7ddb3785458e"} Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.845903 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159c09e9fa11825b995a5ef021c8f71004f34b622507e890fb0f7ddb3785458e" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.865551 4789 scope.go:117] "RemoveContainer" containerID="f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.868532 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:14:44 crc kubenswrapper[4789]: E1216 07:14:44.869026 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48c05ec-30ab-4ea1-a542-35bf74481375" containerName="nova-cell1-conductor-db-sync" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.869050 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48c05ec-30ab-4ea1-a542-35bf74481375" containerName="nova-cell1-conductor-db-sync" Dec 16 07:14:44 crc kubenswrapper[4789]: E1216 07:14:44.869086 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-api" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.869093 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-api" Dec 16 07:14:44 crc kubenswrapper[4789]: E1216 07:14:44.869102 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-log" Dec 16 07:14:44 crc 
kubenswrapper[4789]: I1216 07:14:44.869109 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-log" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.869272 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-log" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.869283 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48c05ec-30ab-4ea1-a542-35bf74481375" containerName="nova-cell1-conductor-db-sync" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.869293 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" containerName="nova-api-api" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.869879 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.892460 4789 scope.go:117] "RemoveContainer" containerID="18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.892579 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 07:14:44 crc kubenswrapper[4789]: E1216 07:14:44.894842 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a\": container with ID starting with 18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a not found: ID does not exist" containerID="18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.894899 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a"} err="failed to get container status \"18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a\": rpc error: code = NotFound desc = could not find container \"18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a\": container with ID starting with 18a7763949845eab35efcbe90a46238b25b1d069b0def7b6781755014a81454a not found: ID does not exist" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.894950 4789 scope.go:117] "RemoveContainer" containerID="f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6" Dec 16 07:14:44 crc kubenswrapper[4789]: E1216 07:14:44.895339 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6\": container with ID starting with f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6 not found: ID does not exist" containerID="f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.895368 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6"} err="failed to get container status \"f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6\": rpc error: code = NotFound desc = could not find container \"f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6\": container with ID starting with f29657875ceb7cf81d077e435f419693e6a35f82bec5b2ba223057cd9c941db6 not found: ID does not exist" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.901730 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.486804979 podStartE2EDuration="2.901706708s" podCreationTimestamp="2025-12-16 07:14:42 +0000 UTC" 
firstStartedPulling="2025-12-16 07:14:43.703891903 +0000 UTC m=+1421.965779532" lastFinishedPulling="2025-12-16 07:14:44.118793632 +0000 UTC m=+1422.380681261" observedRunningTime="2025-12-16 07:14:44.852686187 +0000 UTC m=+1423.114573816" watchObservedRunningTime="2025-12-16 07:14:44.901706708 +0000 UTC m=+1423.163594357" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.926131 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.931961 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2bmd\" (UniqueName: \"kubernetes.io/projected/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-kube-api-access-m2bmd\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.932269 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:44 crc kubenswrapper[4789]: I1216 07:14:44.932304 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.033705 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.033739 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.033774 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2bmd\" (UniqueName: \"kubernetes.io/projected/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-kube-api-access-m2bmd\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.038863 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.040089 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.048697 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2bmd\" (UniqueName: \"kubernetes.io/projected/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-kube-api-access-m2bmd\") pod \"nova-cell1-conductor-0\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc 
kubenswrapper[4789]: I1216 07:14:45.287242 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.338722 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle\") pod \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\" (UID: \"dac0e455-b023-44ce-8ba6-9fd520f1e0fb\") " Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.342149 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dac0e455-b023-44ce-8ba6-9fd520f1e0fb" (UID: "dac0e455-b023-44ce-8ba6-9fd520f1e0fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.441360 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac0e455-b023-44ce-8ba6-9fd520f1e0fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.462267 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.490118 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.501847 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.503650 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.505848 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.507704 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.542539 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4k2\" (UniqueName: \"kubernetes.io/projected/292b2d90-e135-4b61-b939-b88c516cce30-kube-api-access-9c4k2\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.542644 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.542763 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/292b2d90-e135-4b61-b939-b88c516cce30-logs\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.542836 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-config-data\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.646333 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9c4k2\" (UniqueName: \"kubernetes.io/projected/292b2d90-e135-4b61-b939-b88c516cce30-kube-api-access-9c4k2\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.646420 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.646469 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/292b2d90-e135-4b61-b939-b88c516cce30-logs\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.646516 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-config-data\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.647651 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/292b2d90-e135-4b61-b939-b88c516cce30-logs\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.656294 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.683615 
4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-config-data\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.721494 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4k2\" (UniqueName: \"kubernetes.io/projected/292b2d90-e135-4b61-b939-b88c516cce30-kube-api-access-9c4k2\") pod \"nova-api-0\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.833015 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.840849 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.857749 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"00cf1f8f-f55c-445b-9338-b09efa1be344","Type":"ContainerStarted","Data":"3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee"} Dec 16 07:14:45 crc kubenswrapper[4789]: I1216 07:14:45.876685 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.87666884 podStartE2EDuration="2.87666884s" podCreationTimestamp="2025-12-16 07:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:45.871776331 +0000 UTC m=+1424.133663980" watchObservedRunningTime="2025-12-16 07:14:45.87666884 +0000 UTC m=+1424.138556469" Dec 16 07:14:46 crc kubenswrapper[4789]: I1216 07:14:46.123284 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac0e455-b023-44ce-8ba6-9fd520f1e0fb" 
path="/var/lib/kubelet/pods/dac0e455-b023-44ce-8ba6-9fd520f1e0fb/volumes" Dec 16 07:14:46 crc kubenswrapper[4789]: W1216 07:14:46.295970 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292b2d90_e135_4b61_b939_b88c516cce30.slice/crio-c90f89db89b3a1cef819a5cb57aebb47a730c1476349f6ba133f07ff0303e0c8 WatchSource:0}: Error finding container c90f89db89b3a1cef819a5cb57aebb47a730c1476349f6ba133f07ff0303e0c8: Status 404 returned error can't find the container with id c90f89db89b3a1cef819a5cb57aebb47a730c1476349f6ba133f07ff0303e0c8 Dec 16 07:14:46 crc kubenswrapper[4789]: I1216 07:14:46.304447 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:14:46 crc kubenswrapper[4789]: I1216 07:14:46.869027 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe","Type":"ContainerStarted","Data":"219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09"} Dec 16 07:14:46 crc kubenswrapper[4789]: I1216 07:14:46.869333 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe","Type":"ContainerStarted","Data":"fd572611943e791e5dca7dab1ae251e4ab9c924d8ab92f0a8b22914c910b1c53"} Dec 16 07:14:46 crc kubenswrapper[4789]: I1216 07:14:46.871315 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"292b2d90-e135-4b61-b939-b88c516cce30","Type":"ContainerStarted","Data":"c90f89db89b3a1cef819a5cb57aebb47a730c1476349f6ba133f07ff0303e0c8"} Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.532655 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.533037 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.882887 4789 generic.go:334] "Generic (PLEG): container finished" podID="787226ff-bf02-4615-91b7-e5aa06525027" containerID="f65d9f5c61e599c970b9cdce1149af994d2d92774c51a54cfeb707abc5e630e6" exitCode=0 Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.882976 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerDied","Data":"f65d9f5c61e599c970b9cdce1149af994d2d92774c51a54cfeb707abc5e630e6"} Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.885254 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"292b2d90-e135-4b61-b939-b88c516cce30","Type":"ContainerStarted","Data":"ac60ab81a6fe548c21807ff1f2640ee95404ec9cad922f3315427856a38dd0ed"} Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.885280 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"292b2d90-e135-4b61-b939-b88c516cce30","Type":"ContainerStarted","Data":"50258d470728eaf8d2b1970821c04e921998921d29930dc25c2327749b201d10"} Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.885666 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.914388 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.914368553 podStartE2EDuration="3.914368553s" podCreationTimestamp="2025-12-16 07:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:47.900997727 +0000 UTC m=+1426.162885346" watchObservedRunningTime="2025-12-16 07:14:47.914368553 +0000 UTC m=+1426.176256182" Dec 16 07:14:47 crc kubenswrapper[4789]: I1216 07:14:47.923801 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.923781381 podStartE2EDuration="2.923781381s" podCreationTimestamp="2025-12-16 07:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:14:47.91922796 +0000 UTC m=+1426.181115589" watchObservedRunningTime="2025-12-16 07:14:47.923781381 +0000 UTC m=+1426.185669010" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.296420 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.406529 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-scripts\") pod \"787226ff-bf02-4615-91b7-e5aa06525027\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.406581 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl2n2\" (UniqueName: \"kubernetes.io/projected/787226ff-bf02-4615-91b7-e5aa06525027-kube-api-access-vl2n2\") pod \"787226ff-bf02-4615-91b7-e5aa06525027\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.406640 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-sg-core-conf-yaml\") pod \"787226ff-bf02-4615-91b7-e5aa06525027\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.406661 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-config-data\") pod 
\"787226ff-bf02-4615-91b7-e5aa06525027\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.406800 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-log-httpd\") pod \"787226ff-bf02-4615-91b7-e5aa06525027\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.406871 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-run-httpd\") pod \"787226ff-bf02-4615-91b7-e5aa06525027\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.406897 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-combined-ca-bundle\") pod \"787226ff-bf02-4615-91b7-e5aa06525027\" (UID: \"787226ff-bf02-4615-91b7-e5aa06525027\") " Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.408118 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "787226ff-bf02-4615-91b7-e5aa06525027" (UID: "787226ff-bf02-4615-91b7-e5aa06525027"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.408376 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "787226ff-bf02-4615-91b7-e5aa06525027" (UID: "787226ff-bf02-4615-91b7-e5aa06525027"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.408434 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.412627 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-scripts" (OuterVolumeSpecName: "scripts") pod "787226ff-bf02-4615-91b7-e5aa06525027" (UID: "787226ff-bf02-4615-91b7-e5aa06525027"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.413148 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787226ff-bf02-4615-91b7-e5aa06525027-kube-api-access-vl2n2" (OuterVolumeSpecName: "kube-api-access-vl2n2") pod "787226ff-bf02-4615-91b7-e5aa06525027" (UID: "787226ff-bf02-4615-91b7-e5aa06525027"). InnerVolumeSpecName "kube-api-access-vl2n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.433626 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "787226ff-bf02-4615-91b7-e5aa06525027" (UID: "787226ff-bf02-4615-91b7-e5aa06525027"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.504885 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-config-data" (OuterVolumeSpecName: "config-data") pod "787226ff-bf02-4615-91b7-e5aa06525027" (UID: "787226ff-bf02-4615-91b7-e5aa06525027"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.505460 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "787226ff-bf02-4615-91b7-e5aa06525027" (UID: "787226ff-bf02-4615-91b7-e5aa06525027"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.510630 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/787226ff-bf02-4615-91b7-e5aa06525027-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.510661 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.510674 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.510685 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl2n2\" (UniqueName: \"kubernetes.io/projected/787226ff-bf02-4615-91b7-e5aa06525027-kube-api-access-vl2n2\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.510698 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.510708 4789 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/787226ff-bf02-4615-91b7-e5aa06525027-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.900793 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.907169 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"787226ff-bf02-4615-91b7-e5aa06525027","Type":"ContainerDied","Data":"67f22bbd18a94287a5f91c742e4faf9db04658d3f364ae8a05be53c49e2bddd2"} Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.909855 4789 scope.go:117] "RemoveContainer" containerID="475e135f8166ce02f93701cd667929e5a6db20de00a10020d3974c0a405a443c" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.936328 4789 scope.go:117] "RemoveContainer" containerID="6caaba2e9caa91981b19f656daa04c6280bee014350ca8ad8aaae1e9f5256994" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.939872 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.950090 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.962106 4789 scope.go:117] "RemoveContainer" containerID="f65d9f5c61e599c970b9cdce1149af994d2d92774c51a54cfeb707abc5e630e6" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.973248 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:14:48 crc kubenswrapper[4789]: E1216 07:14:48.973729 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="ceilometer-notification-agent" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.973752 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="787226ff-bf02-4615-91b7-e5aa06525027" 
containerName="ceilometer-notification-agent" Dec 16 07:14:48 crc kubenswrapper[4789]: E1216 07:14:48.973763 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="proxy-httpd" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.973771 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="proxy-httpd" Dec 16 07:14:48 crc kubenswrapper[4789]: E1216 07:14:48.973802 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="sg-core" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.973809 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="sg-core" Dec 16 07:14:48 crc kubenswrapper[4789]: E1216 07:14:48.973820 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="ceilometer-central-agent" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.973826 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="ceilometer-central-agent" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.974011 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="ceilometer-notification-agent" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.974021 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="proxy-httpd" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.974036 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="ceilometer-central-agent" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.974045 4789 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="787226ff-bf02-4615-91b7-e5aa06525027" containerName="sg-core" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.975632 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.977318 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.978456 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.978800 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:14:48 crc kubenswrapper[4789]: I1216 07:14:48.993604 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.008679 4789 scope.go:117] "RemoveContainer" containerID="b79cb583fbbb85620150b995e02876362ccfffacd2f652db343ed7fdf4bbd7ea" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.019685 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.019834 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.019858 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-run-httpd\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.019875 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-log-httpd\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.019903 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-scripts\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.020002 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnxb9\" (UniqueName: \"kubernetes.io/projected/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-kube-api-access-xnxb9\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.020044 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.020062 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-config-data\") pod \"ceilometer-0\" (UID: 
\"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122173 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122257 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-run-httpd\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122288 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-log-httpd\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122324 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-scripts\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122372 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnxb9\" (UniqueName: \"kubernetes.io/projected/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-kube-api-access-xnxb9\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122406 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122428 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-config-data\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122806 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-run-httpd\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.122878 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-log-httpd\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.123547 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.127401 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc 
kubenswrapper[4789]: I1216 07:14:49.127887 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.130498 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.131669 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-config-data\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.132654 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-scripts\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.143528 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnxb9\" (UniqueName: \"kubernetes.io/projected/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-kube-api-access-xnxb9\") pod \"ceilometer-0\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.183526 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.325551 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.783122 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:14:49 crc kubenswrapper[4789]: W1216 07:14:49.786867 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbec26ed2_1e5e_4b82_9e58_3b87cfd9bf88.slice/crio-ebf04e8ef82da65397e9663cefea46b51f955ec389cbcdfeedf9af9aadec4262 WatchSource:0}: Error finding container ebf04e8ef82da65397e9663cefea46b51f955ec389cbcdfeedf9af9aadec4262: Status 404 returned error can't find the container with id ebf04e8ef82da65397e9663cefea46b51f955ec389cbcdfeedf9af9aadec4262 Dec 16 07:14:49 crc kubenswrapper[4789]: I1216 07:14:49.913681 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerStarted","Data":"ebf04e8ef82da65397e9663cefea46b51f955ec389cbcdfeedf9af9aadec4262"} Dec 16 07:14:50 crc kubenswrapper[4789]: I1216 07:14:50.116738 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787226ff-bf02-4615-91b7-e5aa06525027" path="/var/lib/kubelet/pods/787226ff-bf02-4615-91b7-e5aa06525027/volumes" Dec 16 07:14:50 crc kubenswrapper[4789]: I1216 07:14:50.922513 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerStarted","Data":"366c79e647cc93ab3227a5511aadcde1f210df15532f4364cc3433bd27f092b8"} Dec 16 07:14:51 crc kubenswrapper[4789]: I1216 07:14:51.927518 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:14:51 crc kubenswrapper[4789]: I1216 
07:14:51.928632 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:14:51 crc kubenswrapper[4789]: I1216 07:14:51.932625 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerStarted","Data":"2dcf1d180d144918b38fe62db9478f8a57310e1c94553aa1c570bdf87fa3bbea"} Dec 16 07:14:52 crc kubenswrapper[4789]: I1216 07:14:52.533054 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:14:52 crc kubenswrapper[4789]: I1216 07:14:52.533424 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:14:52 crc kubenswrapper[4789]: I1216 07:14:52.943381 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerStarted","Data":"95543c0eff0750dd88b8a74582bf85ba80cd3c8e04639c50723adcf7a8277b01"} Dec 16 07:14:53 crc kubenswrapper[4789]: I1216 07:14:53.242486 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 07:14:53 crc kubenswrapper[4789]: I1216 07:14:53.552121 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:14:53 crc kubenswrapper[4789]: I1216 07:14:53.552121 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:14:54 crc kubenswrapper[4789]: I1216 07:14:54.184776 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 07:14:54 crc kubenswrapper[4789]: I1216 07:14:54.215497 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 07:14:54 crc kubenswrapper[4789]: I1216 07:14:54.970594 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerStarted","Data":"8c0b14cdebc66dcba62089b3051a5cea447ed0938b79bbaf1a2427dd421c8556"} Dec 16 07:14:54 crc kubenswrapper[4789]: I1216 07:14:54.971030 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:14:54 crc kubenswrapper[4789]: I1216 07:14:54.999009 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.080998741 podStartE2EDuration="6.998988326s" podCreationTimestamp="2025-12-16 07:14:48 +0000 UTC" firstStartedPulling="2025-12-16 07:14:49.789076565 +0000 UTC m=+1428.050964194" lastFinishedPulling="2025-12-16 07:14:53.70706615 +0000 UTC m=+1431.968953779" observedRunningTime="2025-12-16 07:14:54.989038823 +0000 UTC m=+1433.250926452" watchObservedRunningTime="2025-12-16 07:14:54.998988326 +0000 UTC m=+1433.260875955" Dec 16 07:14:55 crc kubenswrapper[4789]: I1216 07:14:55.004569 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 07:14:55 crc kubenswrapper[4789]: I1216 07:14:55.315556 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 
07:14:55 crc kubenswrapper[4789]: I1216 07:14:55.833485 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:14:55 crc kubenswrapper[4789]: I1216 07:14:55.835617 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:14:56 crc kubenswrapper[4789]: I1216 07:14:56.917115 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:14:56 crc kubenswrapper[4789]: I1216 07:14:56.917131 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.143160 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4"] Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.145264 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.149785 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.150932 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.158895 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4"] Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.285511 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e7e593c-0688-4ff7-b959-37f36a74aa2b-config-volume\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.285865 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e7e593c-0688-4ff7-b959-37f36a74aa2b-secret-volume\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.286011 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztrps\" (UniqueName: \"kubernetes.io/projected/1e7e593c-0688-4ff7-b959-37f36a74aa2b-kube-api-access-ztrps\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.388392 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e7e593c-0688-4ff7-b959-37f36a74aa2b-config-volume\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.388934 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e7e593c-0688-4ff7-b959-37f36a74aa2b-secret-volume\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.389145 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrps\" (UniqueName: \"kubernetes.io/projected/1e7e593c-0688-4ff7-b959-37f36a74aa2b-kube-api-access-ztrps\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.389402 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e7e593c-0688-4ff7-b959-37f36a74aa2b-config-volume\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.395382 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1e7e593c-0688-4ff7-b959-37f36a74aa2b-secret-volume\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.408109 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztrps\" (UniqueName: \"kubernetes.io/projected/1e7e593c-0688-4ff7-b959-37f36a74aa2b-kube-api-access-ztrps\") pod \"collect-profiles-29431155-g4cb4\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.497604 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:00 crc kubenswrapper[4789]: I1216 07:15:00.928520 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4"] Dec 16 07:15:00 crc kubenswrapper[4789]: W1216 07:15:00.930340 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7e593c_0688_4ff7_b959_37f36a74aa2b.slice/crio-be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5 WatchSource:0}: Error finding container be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5: Status 404 returned error can't find the container with id be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5 Dec 16 07:15:01 crc kubenswrapper[4789]: I1216 07:15:01.020784 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" event={"ID":"1e7e593c-0688-4ff7-b959-37f36a74aa2b","Type":"ContainerStarted","Data":"be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5"} Dec 16 07:15:02 crc 
kubenswrapper[4789]: I1216 07:15:02.030940 4789 generic.go:334] "Generic (PLEG): container finished" podID="1e7e593c-0688-4ff7-b959-37f36a74aa2b" containerID="99834c5c917095ba527996933ce5acb86ed694cf016c7bc85b9712ee416f3bd3" exitCode=0 Dec 16 07:15:02 crc kubenswrapper[4789]: I1216 07:15:02.031022 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" event={"ID":"1e7e593c-0688-4ff7-b959-37f36a74aa2b","Type":"ContainerDied","Data":"99834c5c917095ba527996933ce5acb86ed694cf016c7bc85b9712ee416f3bd3"} Dec 16 07:15:02 crc kubenswrapper[4789]: I1216 07:15:02.542271 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 07:15:02 crc kubenswrapper[4789]: I1216 07:15:02.545880 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 07:15:02 crc kubenswrapper[4789]: I1216 07:15:02.549346 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.042822 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.439967 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.549878 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e7e593c-0688-4ff7-b959-37f36a74aa2b-config-volume\") pod \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.550001 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztrps\" (UniqueName: \"kubernetes.io/projected/1e7e593c-0688-4ff7-b959-37f36a74aa2b-kube-api-access-ztrps\") pod \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.550129 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e7e593c-0688-4ff7-b959-37f36a74aa2b-secret-volume\") pod \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\" (UID: \"1e7e593c-0688-4ff7-b959-37f36a74aa2b\") " Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.551080 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e7e593c-0688-4ff7-b959-37f36a74aa2b-config-volume" (OuterVolumeSpecName: "config-volume") pod "1e7e593c-0688-4ff7-b959-37f36a74aa2b" (UID: "1e7e593c-0688-4ff7-b959-37f36a74aa2b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.558160 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7e593c-0688-4ff7-b959-37f36a74aa2b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1e7e593c-0688-4ff7-b959-37f36a74aa2b" (UID: "1e7e593c-0688-4ff7-b959-37f36a74aa2b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.558244 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7e593c-0688-4ff7-b959-37f36a74aa2b-kube-api-access-ztrps" (OuterVolumeSpecName: "kube-api-access-ztrps") pod "1e7e593c-0688-4ff7-b959-37f36a74aa2b" (UID: "1e7e593c-0688-4ff7-b959-37f36a74aa2b"). InnerVolumeSpecName "kube-api-access-ztrps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.652030 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e7e593c-0688-4ff7-b959-37f36a74aa2b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.652070 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e7e593c-0688-4ff7-b959-37f36a74aa2b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:03 crc kubenswrapper[4789]: I1216 07:15:03.652080 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztrps\" (UniqueName: \"kubernetes.io/projected/1e7e593c-0688-4ff7-b959-37f36a74aa2b-kube-api-access-ztrps\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:04 crc kubenswrapper[4789]: I1216 07:15:04.049531 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" Dec 16 07:15:04 crc kubenswrapper[4789]: I1216 07:15:04.049538 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4" event={"ID":"1e7e593c-0688-4ff7-b959-37f36a74aa2b","Type":"ContainerDied","Data":"be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5"} Dec 16 07:15:04 crc kubenswrapper[4789]: I1216 07:15:04.049581 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5" Dec 16 07:15:04 crc kubenswrapper[4789]: E1216 07:15:04.874639 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7e593c_0688_4ff7_b959_37f36a74aa2b.slice/crio-be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5\": RecentStats: unable to find data in memory cache]" Dec 16 07:15:04 crc kubenswrapper[4789]: I1216 07:15:04.962185 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.060610 4789 generic.go:334] "Generic (PLEG): container finished" podID="cae1f599-5496-4a3f-8f82-b77d9923f9aa" containerID="03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a" exitCode=137 Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.060715 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.060694 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cae1f599-5496-4a3f-8f82-b77d9923f9aa","Type":"ContainerDied","Data":"03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a"} Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.060792 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cae1f599-5496-4a3f-8f82-b77d9923f9aa","Type":"ContainerDied","Data":"f2a72b260fb2346bd7692140de84b1b16d5ac168474da22e73e83620f5f00e2c"} Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.060818 4789 scope.go:117] "RemoveContainer" containerID="03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.082347 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-combined-ca-bundle\") pod \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.082397 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzhxw\" (UniqueName: \"kubernetes.io/projected/cae1f599-5496-4a3f-8f82-b77d9923f9aa-kube-api-access-xzhxw\") pod \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.082533 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-config-data\") pod \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\" (UID: \"cae1f599-5496-4a3f-8f82-b77d9923f9aa\") " Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.102852 4789 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae1f599-5496-4a3f-8f82-b77d9923f9aa-kube-api-access-xzhxw" (OuterVolumeSpecName: "kube-api-access-xzhxw") pod "cae1f599-5496-4a3f-8f82-b77d9923f9aa" (UID: "cae1f599-5496-4a3f-8f82-b77d9923f9aa"). InnerVolumeSpecName "kube-api-access-xzhxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.109596 4789 scope.go:117] "RemoveContainer" containerID="03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a" Dec 16 07:15:05 crc kubenswrapper[4789]: E1216 07:15:05.112022 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a\": container with ID starting with 03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a not found: ID does not exist" containerID="03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.112070 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a"} err="failed to get container status \"03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a\": rpc error: code = NotFound desc = could not find container \"03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a\": container with ID starting with 03c5d6b2e96f59b97427d2300fc81e0c1e386d58ee98ad814dadb70f29375b9a not found: ID does not exist" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.118238 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cae1f599-5496-4a3f-8f82-b77d9923f9aa" (UID: "cae1f599-5496-4a3f-8f82-b77d9923f9aa"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.136177 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-config-data" (OuterVolumeSpecName: "config-data") pod "cae1f599-5496-4a3f-8f82-b77d9923f9aa" (UID: "cae1f599-5496-4a3f-8f82-b77d9923f9aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.186568 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.186611 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae1f599-5496-4a3f-8f82-b77d9923f9aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.186630 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzhxw\" (UniqueName: \"kubernetes.io/projected/cae1f599-5496-4a3f-8f82-b77d9923f9aa-kube-api-access-xzhxw\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.402050 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.411713 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.426468 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:15:05 crc kubenswrapper[4789]: E1216 07:15:05.426828 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7e593c-0688-4ff7-b959-37f36a74aa2b" 
containerName="collect-profiles" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.426849 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7e593c-0688-4ff7-b959-37f36a74aa2b" containerName="collect-profiles" Dec 16 07:15:05 crc kubenswrapper[4789]: E1216 07:15:05.426873 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae1f599-5496-4a3f-8f82-b77d9923f9aa" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.426880 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae1f599-5496-4a3f-8f82-b77d9923f9aa" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.427073 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7e593c-0688-4ff7-b959-37f36a74aa2b" containerName="collect-profiles" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.427086 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae1f599-5496-4a3f-8f82-b77d9923f9aa" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.427720 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.430434 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.430672 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.431018 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.449146 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.596587 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.596660 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.596769 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msn76\" (UniqueName: \"kubernetes.io/projected/00293d36-0c18-4d79-aacd-4224045ff895-kube-api-access-msn76\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 
16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.596973 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.597011 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.698698 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.698743 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.698772 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.698810 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.698872 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msn76\" (UniqueName: \"kubernetes.io/projected/00293d36-0c18-4d79-aacd-4224045ff895-kube-api-access-msn76\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.703360 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.703366 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.703624 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.704062 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.715057 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msn76\" (UniqueName: \"kubernetes.io/projected/00293d36-0c18-4d79-aacd-4224045ff895-kube-api-access-msn76\") pod \"nova-cell1-novncproxy-0\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.750474 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.959278 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.961973 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:15:05 crc kubenswrapper[4789]: I1216 07:15:05.994902 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.004995 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.070710 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.094630 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.121264 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae1f599-5496-4a3f-8f82-b77d9923f9aa" 
path="/var/lib/kubelet/pods/cae1f599-5496-4a3f-8f82-b77d9923f9aa/volumes" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.263991 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-mgsqq"] Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.266068 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.279122 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-mgsqq"] Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.351666 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.410579 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.410663 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.410787 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 
07:15:06.410884 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-config\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.411037 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78m57\" (UniqueName: \"kubernetes.io/projected/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-kube-api-access-78m57\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.411063 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.512672 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-config\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.512772 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78m57\" (UniqueName: \"kubernetes.io/projected/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-kube-api-access-78m57\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 
07:15:06.512805 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.512849 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.512881 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.512961 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.514495 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-config\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.514494 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-sb\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.515190 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.515259 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-swift-storage-0\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.515621 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-nb\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.533079 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78m57\" (UniqueName: \"kubernetes.io/projected/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-kube-api-access-78m57\") pod \"dnsmasq-dns-fcd6f8f8f-mgsqq\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:06 crc kubenswrapper[4789]: I1216 07:15:06.604989 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:07 crc kubenswrapper[4789]: I1216 07:15:07.093605 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00293d36-0c18-4d79-aacd-4224045ff895","Type":"ContainerStarted","Data":"702d34f2d07c6bd63fc7dbcd22cbe07825238be81729f58170d0b20b26fc3bf9"} Dec 16 07:15:07 crc kubenswrapper[4789]: I1216 07:15:07.093980 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00293d36-0c18-4d79-aacd-4224045ff895","Type":"ContainerStarted","Data":"a46dbca22a6efeef6310c97c3fda17650a2c7dc6c2a22977565a3c3b44654984"} Dec 16 07:15:07 crc kubenswrapper[4789]: I1216 07:15:07.094637 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-mgsqq"] Dec 16 07:15:07 crc kubenswrapper[4789]: I1216 07:15:07.118253 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.118234127 podStartE2EDuration="2.118234127s" podCreationTimestamp="2025-12-16 07:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:07.116370072 +0000 UTC m=+1445.378257701" watchObservedRunningTime="2025-12-16 07:15:07.118234127 +0000 UTC m=+1445.380121766" Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.117966 4789 generic.go:334] "Generic (PLEG): container finished" podID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerID="c3e023a9afe9c5691200c93fa3115544e6576e412ec57a86b25b5ebaba43ebb1" exitCode=0 Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.121704 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" event={"ID":"0ff4de9f-c7d4-4d77-81be-7a499ead0f10","Type":"ContainerDied","Data":"c3e023a9afe9c5691200c93fa3115544e6576e412ec57a86b25b5ebaba43ebb1"} Dec 16 07:15:08 
crc kubenswrapper[4789]: I1216 07:15:08.121749 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" event={"ID":"0ff4de9f-c7d4-4d77-81be-7a499ead0f10","Type":"ContainerStarted","Data":"9611a31b61e3c622dafebea51d27dbf4d41aaed7f4df133158e79d4bda05d9fd"} Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.612901 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.613582 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-central-agent" containerID="cri-o://366c79e647cc93ab3227a5511aadcde1f210df15532f4364cc3433bd27f092b8" gracePeriod=30 Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.613654 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="sg-core" containerID="cri-o://95543c0eff0750dd88b8a74582bf85ba80cd3c8e04639c50723adcf7a8277b01" gracePeriod=30 Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.613681 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-notification-agent" containerID="cri-o://2dcf1d180d144918b38fe62db9478f8a57310e1c94553aa1c570bdf87fa3bbea" gracePeriod=30 Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.613657 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="proxy-httpd" containerID="cri-o://8c0b14cdebc66dcba62089b3051a5cea447ed0938b79bbaf1a2427dd421c8556" gracePeriod=30 Dec 16 07:15:08 crc kubenswrapper[4789]: I1216 07:15:08.623152 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" 
podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.195:3000/\": read tcp 10.217.0.2:34472->10.217.0.195:3000: read: connection reset by peer" Dec 16 07:15:09 crc kubenswrapper[4789]: I1216 07:15:09.131032 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" event={"ID":"0ff4de9f-c7d4-4d77-81be-7a499ead0f10","Type":"ContainerStarted","Data":"1592998b94c2a7232306fe54bf6cc98c4eff08132169b26bf5e601d8973f4fa0"} Dec 16 07:15:09 crc kubenswrapper[4789]: I1216 07:15:09.131148 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:09 crc kubenswrapper[4789]: I1216 07:15:09.137209 4789 generic.go:334] "Generic (PLEG): container finished" podID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerID="95543c0eff0750dd88b8a74582bf85ba80cd3c8e04639c50723adcf7a8277b01" exitCode=2 Dec 16 07:15:09 crc kubenswrapper[4789]: I1216 07:15:09.137259 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerDied","Data":"95543c0eff0750dd88b8a74582bf85ba80cd3c8e04639c50723adcf7a8277b01"} Dec 16 07:15:09 crc kubenswrapper[4789]: I1216 07:15:09.155870 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" podStartSLOduration=3.155853523 podStartE2EDuration="3.155853523s" podCreationTimestamp="2025-12-16 07:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:09.153814303 +0000 UTC m=+1447.415701932" watchObservedRunningTime="2025-12-16 07:15:09.155853523 +0000 UTC m=+1447.417741152" Dec 16 07:15:09 crc kubenswrapper[4789]: I1216 07:15:09.761481 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:09 crc 
kubenswrapper[4789]: I1216 07:15:09.762051 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-log" containerID="cri-o://50258d470728eaf8d2b1970821c04e921998921d29930dc25c2327749b201d10" gracePeriod=30 Dec 16 07:15:09 crc kubenswrapper[4789]: I1216 07:15:09.762136 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-api" containerID="cri-o://ac60ab81a6fe548c21807ff1f2640ee95404ec9cad922f3315427856a38dd0ed" gracePeriod=30 Dec 16 07:15:10 crc kubenswrapper[4789]: I1216 07:15:10.152391 4789 generic.go:334] "Generic (PLEG): container finished" podID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerID="8c0b14cdebc66dcba62089b3051a5cea447ed0938b79bbaf1a2427dd421c8556" exitCode=0 Dec 16 07:15:10 crc kubenswrapper[4789]: I1216 07:15:10.152418 4789 generic.go:334] "Generic (PLEG): container finished" podID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerID="366c79e647cc93ab3227a5511aadcde1f210df15532f4364cc3433bd27f092b8" exitCode=0 Dec 16 07:15:10 crc kubenswrapper[4789]: I1216 07:15:10.152466 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerDied","Data":"8c0b14cdebc66dcba62089b3051a5cea447ed0938b79bbaf1a2427dd421c8556"} Dec 16 07:15:10 crc kubenswrapper[4789]: I1216 07:15:10.152524 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerDied","Data":"366c79e647cc93ab3227a5511aadcde1f210df15532f4364cc3433bd27f092b8"} Dec 16 07:15:10 crc kubenswrapper[4789]: I1216 07:15:10.155831 4789 generic.go:334] "Generic (PLEG): container finished" podID="292b2d90-e135-4b61-b939-b88c516cce30" 
containerID="50258d470728eaf8d2b1970821c04e921998921d29930dc25c2327749b201d10" exitCode=143 Dec 16 07:15:10 crc kubenswrapper[4789]: I1216 07:15:10.155903 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"292b2d90-e135-4b61-b939-b88c516cce30","Type":"ContainerDied","Data":"50258d470728eaf8d2b1970821c04e921998921d29930dc25c2327749b201d10"} Dec 16 07:15:10 crc kubenswrapper[4789]: I1216 07:15:10.750661 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.201411 4789 generic.go:334] "Generic (PLEG): container finished" podID="292b2d90-e135-4b61-b939-b88c516cce30" containerID="ac60ab81a6fe548c21807ff1f2640ee95404ec9cad922f3315427856a38dd0ed" exitCode=0 Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.201486 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"292b2d90-e135-4b61-b939-b88c516cce30","Type":"ContainerDied","Data":"ac60ab81a6fe548c21807ff1f2640ee95404ec9cad922f3315427856a38dd0ed"} Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.351109 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.457641 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c4k2\" (UniqueName: \"kubernetes.io/projected/292b2d90-e135-4b61-b939-b88c516cce30-kube-api-access-9c4k2\") pod \"292b2d90-e135-4b61-b939-b88c516cce30\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.458070 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-config-data\") pod \"292b2d90-e135-4b61-b939-b88c516cce30\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.458154 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/292b2d90-e135-4b61-b939-b88c516cce30-logs\") pod \"292b2d90-e135-4b61-b939-b88c516cce30\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.458181 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-combined-ca-bundle\") pod \"292b2d90-e135-4b61-b939-b88c516cce30\" (UID: \"292b2d90-e135-4b61-b939-b88c516cce30\") " Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.458668 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/292b2d90-e135-4b61-b939-b88c516cce30-logs" (OuterVolumeSpecName: "logs") pod "292b2d90-e135-4b61-b939-b88c516cce30" (UID: "292b2d90-e135-4b61-b939-b88c516cce30"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.458965 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/292b2d90-e135-4b61-b939-b88c516cce30-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.468172 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292b2d90-e135-4b61-b939-b88c516cce30-kube-api-access-9c4k2" (OuterVolumeSpecName: "kube-api-access-9c4k2") pod "292b2d90-e135-4b61-b939-b88c516cce30" (UID: "292b2d90-e135-4b61-b939-b88c516cce30"). InnerVolumeSpecName "kube-api-access-9c4k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.485096 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-config-data" (OuterVolumeSpecName: "config-data") pod "292b2d90-e135-4b61-b939-b88c516cce30" (UID: "292b2d90-e135-4b61-b939-b88c516cce30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.500096 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "292b2d90-e135-4b61-b939-b88c516cce30" (UID: "292b2d90-e135-4b61-b939-b88c516cce30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.560739 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c4k2\" (UniqueName: \"kubernetes.io/projected/292b2d90-e135-4b61-b939-b88c516cce30-kube-api-access-9c4k2\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.560787 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:13 crc kubenswrapper[4789]: I1216 07:15:13.560801 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b2d90-e135-4b61-b939-b88c516cce30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.217105 4789 generic.go:334] "Generic (PLEG): container finished" podID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerID="2dcf1d180d144918b38fe62db9478f8a57310e1c94553aa1c570bdf87fa3bbea" exitCode=0 Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.217166 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerDied","Data":"2dcf1d180d144918b38fe62db9478f8a57310e1c94553aa1c570bdf87fa3bbea"} Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.222661 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"292b2d90-e135-4b61-b939-b88c516cce30","Type":"ContainerDied","Data":"c90f89db89b3a1cef819a5cb57aebb47a730c1476349f6ba133f07ff0303e0c8"} Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.222713 4789 scope.go:117] "RemoveContainer" containerID="ac60ab81a6fe548c21807ff1f2640ee95404ec9cad922f3315427856a38dd0ed" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.222851 4789 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.316087 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.322387 4789 scope.go:117] "RemoveContainer" containerID="50258d470728eaf8d2b1970821c04e921998921d29930dc25c2327749b201d10" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.332887 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.342993 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.390395 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:14 crc kubenswrapper[4789]: E1216 07:15:14.390850 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-log" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.390870 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-log" Dec 16 07:15:14 crc kubenswrapper[4789]: E1216 07:15:14.390883 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="proxy-httpd" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.390890 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="proxy-httpd" Dec 16 07:15:14 crc kubenswrapper[4789]: E1216 07:15:14.390992 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="sg-core" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391001 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" 
containerName="sg-core" Dec 16 07:15:14 crc kubenswrapper[4789]: E1216 07:15:14.391013 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-central-agent" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391020 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-central-agent" Dec 16 07:15:14 crc kubenswrapper[4789]: E1216 07:15:14.391041 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-api" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391049 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-api" Dec 16 07:15:14 crc kubenswrapper[4789]: E1216 07:15:14.391060 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-notification-agent" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391067 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-notification-agent" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391268 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-log" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391285 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="sg-core" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391299 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="proxy-httpd" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391312 4789 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-central-agent" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391324 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" containerName="ceilometer-notification-agent" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.391337 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="292b2d90-e135-4b61-b939-b88c516cce30" containerName="nova-api-api" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.392393 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.395230 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.395436 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.395550 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.414195 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.476581 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-scripts\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.476753 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-run-httpd\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 
16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.476797 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnxb9\" (UniqueName: \"kubernetes.io/projected/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-kube-api-access-xnxb9\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.476872 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-log-httpd\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.476907 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-combined-ca-bundle\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.476956 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-config-data\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.476980 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-ceilometer-tls-certs\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.477033 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-sg-core-conf-yaml\") pod \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\" (UID: \"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88\") " Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.477265 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-config-data\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.477390 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0779534-cd98-458a-83d8-df38579eb250-logs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.477438 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk2wz\" (UniqueName: \"kubernetes.io/projected/a0779534-cd98-458a-83d8-df38579eb250-kube-api-access-sk2wz\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.477474 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.477517 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.477545 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.478409 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.478777 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.483479 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-scripts" (OuterVolumeSpecName: "scripts") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.488531 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-kube-api-access-xnxb9" (OuterVolumeSpecName: "kube-api-access-xnxb9") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "kube-api-access-xnxb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.525239 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.557120 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580275 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580353 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580401 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-config-data\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580509 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0779534-cd98-458a-83d8-df38579eb250-logs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580562 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk2wz\" (UniqueName: \"kubernetes.io/projected/a0779534-cd98-458a-83d8-df38579eb250-kube-api-access-sk2wz\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580595 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580687 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580698 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580707 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnxb9\" (UniqueName: \"kubernetes.io/projected/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-kube-api-access-xnxb9\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580720 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580729 4789 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.580738 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.581599 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.581699 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0779534-cd98-458a-83d8-df38579eb250-logs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.584987 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.585535 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.586197 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-config-data\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.588771 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.593633 
4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-config-data" (OuterVolumeSpecName: "config-data") pod "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" (UID: "bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.598407 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk2wz\" (UniqueName: \"kubernetes.io/projected/a0779534-cd98-458a-83d8-df38579eb250-kube-api-access-sk2wz\") pod \"nova-api-0\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " pod="openstack/nova-api-0" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.682499 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.682531 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:14 crc kubenswrapper[4789]: I1216 07:15:14.718663 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:15 crc kubenswrapper[4789]: E1216 07:15:15.148174 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7e593c_0688_4ff7_b959_37f36a74aa2b.slice/crio-be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5\": RecentStats: unable to find data in memory cache]" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.150297 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.232858 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.232890 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88","Type":"ContainerDied","Data":"ebf04e8ef82da65397e9663cefea46b51f955ec389cbcdfeedf9af9aadec4262"} Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.233317 4789 scope.go:117] "RemoveContainer" containerID="8c0b14cdebc66dcba62089b3051a5cea447ed0938b79bbaf1a2427dd421c8556" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.234880 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0779534-cd98-458a-83d8-df38579eb250","Type":"ContainerStarted","Data":"334eca73992cf1f957dbbb1d934a2388845abccf75b899414c0d554124b28ff1"} Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.273996 4789 scope.go:117] "RemoveContainer" containerID="95543c0eff0750dd88b8a74582bf85ba80cd3c8e04639c50723adcf7a8277b01" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.314876 4789 scope.go:117] "RemoveContainer" containerID="2dcf1d180d144918b38fe62db9478f8a57310e1c94553aa1c570bdf87fa3bbea" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.315943 4789 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.333964 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.344490 4789 scope.go:117] "RemoveContainer" containerID="366c79e647cc93ab3227a5511aadcde1f210df15532f4364cc3433bd27f092b8" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.366359 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.368933 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.372780 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.373097 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.372832 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.377158 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506122 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-config-data\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506287 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506365 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-scripts\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506450 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-run-httpd\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506592 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-log-httpd\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506649 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bwp7\" (UniqueName: \"kubernetes.io/projected/1894718e-3dac-4430-9285-e397fb21e852-kube-api-access-6bwp7\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506695 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.506756 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608238 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608306 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-config-data\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608350 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608378 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-scripts\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608426 4789 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-run-httpd\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608492 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-log-httpd\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608540 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bwp7\" (UniqueName: \"kubernetes.io/projected/1894718e-3dac-4430-9285-e397fb21e852-kube-api-access-6bwp7\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.608581 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.609230 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-run-httpd\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.609512 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-log-httpd\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc 
kubenswrapper[4789]: I1216 07:15:15.612985 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.613635 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.615880 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.616812 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-config-data\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.627029 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-scripts\") pod \"ceilometer-0\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.630926 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bwp7\" (UniqueName: \"kubernetes.io/projected/1894718e-3dac-4430-9285-e397fb21e852-kube-api-access-6bwp7\") pod \"ceilometer-0\" 
(UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.687365 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.751383 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:15 crc kubenswrapper[4789]: I1216 07:15:15.770291 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.120079 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292b2d90-e135-4b61-b939-b88c516cce30" path="/var/lib/kubelet/pods/292b2d90-e135-4b61-b939-b88c516cce30/volumes" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.120796 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88" path="/var/lib/kubelet/pods/bec26ed2-1e5e-4b82-9e58-3b87cfd9bf88/volumes" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.188986 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:15:16 crc kubenswrapper[4789]: W1216 07:15:16.201803 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1894718e_3dac_4430_9285_e397fb21e852.slice/crio-068789b909c49a0acec2052a1433774096d7594593fe4299026c45169c555698 WatchSource:0}: Error finding container 068789b909c49a0acec2052a1433774096d7594593fe4299026c45169c555698: Status 404 returned error can't find the container with id 068789b909c49a0acec2052a1433774096d7594593fe4299026c45169c555698 Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.204075 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:15:16 crc kubenswrapper[4789]: 
I1216 07:15:16.248572 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0779534-cd98-458a-83d8-df38579eb250","Type":"ContainerStarted","Data":"df2a294286c1b8afd2a5ef7eccb689de271e71f689ca17119dae658fad791112"} Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.248624 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0779534-cd98-458a-83d8-df38579eb250","Type":"ContainerStarted","Data":"49c5368cc9ee172d9613be1998a8fcea541b7c1d603b5d95de3280ea5a8a2977"} Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.249452 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerStarted","Data":"068789b909c49a0acec2052a1433774096d7594593fe4299026c45169c555698"} Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.273545 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.27352531 podStartE2EDuration="2.27352531s" podCreationTimestamp="2025-12-16 07:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:16.265718251 +0000 UTC m=+1454.527605890" watchObservedRunningTime="2025-12-16 07:15:16.27352531 +0000 UTC m=+1454.535412929" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.282251 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.456929 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dlrbk"] Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.458175 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.461824 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.468800 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dlrbk"] Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.492233 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.607084 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.638355 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.638406 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr66c\" (UniqueName: \"kubernetes.io/projected/c0a1b02c-34b2-4955-800f-e21970da98d9-kube-api-access-kr66c\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.638502 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-scripts\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 
07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.638542 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-config-data\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.677967 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-4g6jl"] Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.678243 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" podUID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerName="dnsmasq-dns" containerID="cri-o://a9d3e2d0717a7e9587d61cdaf207fe23e29b7bf0ae3ee2517d87efb7bc8817af" gracePeriod=10 Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.740892 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.740968 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr66c\" (UniqueName: \"kubernetes.io/projected/c0a1b02c-34b2-4955-800f-e21970da98d9-kube-api-access-kr66c\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.740995 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-config-data\") pod 
\"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.741014 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-scripts\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.756448 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-scripts\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.756491 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.757212 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-config-data\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.777520 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr66c\" (UniqueName: \"kubernetes.io/projected/c0a1b02c-34b2-4955-800f-e21970da98d9-kube-api-access-kr66c\") pod \"nova-cell1-cell-mapping-dlrbk\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " 
pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:16 crc kubenswrapper[4789]: I1216 07:15:16.791220 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.278196 4789 generic.go:334] "Generic (PLEG): container finished" podID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerID="a9d3e2d0717a7e9587d61cdaf207fe23e29b7bf0ae3ee2517d87efb7bc8817af" exitCode=0 Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.278345 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" event={"ID":"d2f16298-806a-4dfb-a320-96f52dfeeb6e","Type":"ContainerDied","Data":"a9d3e2d0717a7e9587d61cdaf207fe23e29b7bf0ae3ee2517d87efb7bc8817af"} Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.515707 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.644845 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dlrbk"] Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.670190 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-nb\") pod \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.670259 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-swift-storage-0\") pod \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.670286 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-config\") pod \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.670436 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-svc\") pod \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.670519 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-sb\") pod \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.670577 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl7xv\" (UniqueName: \"kubernetes.io/projected/d2f16298-806a-4dfb-a320-96f52dfeeb6e-kube-api-access-vl7xv\") pod \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\" (UID: \"d2f16298-806a-4dfb-a320-96f52dfeeb6e\") " Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.680211 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f16298-806a-4dfb-a320-96f52dfeeb6e-kube-api-access-vl7xv" (OuterVolumeSpecName: "kube-api-access-vl7xv") pod "d2f16298-806a-4dfb-a320-96f52dfeeb6e" (UID: "d2f16298-806a-4dfb-a320-96f52dfeeb6e"). InnerVolumeSpecName "kube-api-access-vl7xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.731374 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2f16298-806a-4dfb-a320-96f52dfeeb6e" (UID: "d2f16298-806a-4dfb-a320-96f52dfeeb6e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.738126 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2f16298-806a-4dfb-a320-96f52dfeeb6e" (UID: "d2f16298-806a-4dfb-a320-96f52dfeeb6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.746449 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2f16298-806a-4dfb-a320-96f52dfeeb6e" (UID: "d2f16298-806a-4dfb-a320-96f52dfeeb6e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.755520 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-config" (OuterVolumeSpecName: "config") pod "d2f16298-806a-4dfb-a320-96f52dfeeb6e" (UID: "d2f16298-806a-4dfb-a320-96f52dfeeb6e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.755720 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2f16298-806a-4dfb-a320-96f52dfeeb6e" (UID: "d2f16298-806a-4dfb-a320-96f52dfeeb6e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.774322 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.774403 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl7xv\" (UniqueName: \"kubernetes.io/projected/d2f16298-806a-4dfb-a320-96f52dfeeb6e-kube-api-access-vl7xv\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.774417 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.774427 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.774439 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:17 crc kubenswrapper[4789]: I1216 07:15:17.774449 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d2f16298-806a-4dfb-a320-96f52dfeeb6e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.313904 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" event={"ID":"d2f16298-806a-4dfb-a320-96f52dfeeb6e","Type":"ContainerDied","Data":"750543b9a7a7a78602f9f5882aacc35ea7e73948b7cc7771b456a60032b284fe"} Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.314324 4789 scope.go:117] "RemoveContainer" containerID="a9d3e2d0717a7e9587d61cdaf207fe23e29b7bf0ae3ee2517d87efb7bc8817af" Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.314539 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647df7b8c5-4g6jl" Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.318009 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlrbk" event={"ID":"c0a1b02c-34b2-4955-800f-e21970da98d9","Type":"ContainerStarted","Data":"94400587bfb136f7e982ac2bdabf638fd15508e132e3af50147661d7ccefa7f1"} Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.318048 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlrbk" event={"ID":"c0a1b02c-34b2-4955-800f-e21970da98d9","Type":"ContainerStarted","Data":"9503e1f19773949572d7f35da9938995adcbb2aa40bb94bb59a8cf4dc2de699c"} Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.327500 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerStarted","Data":"9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4"} Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.338806 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dlrbk" podStartSLOduration=2.338784697 podStartE2EDuration="2.338784697s" 
podCreationTimestamp="2025-12-16 07:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:18.335440566 +0000 UTC m=+1456.597328205" watchObservedRunningTime="2025-12-16 07:15:18.338784697 +0000 UTC m=+1456.600672326" Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.343896 4789 scope.go:117] "RemoveContainer" containerID="cbfb2e0ba2710519517623fa41df119a7254c846ae45bffd7f3247161de26dbd" Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.379977 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-4g6jl"] Dec 16 07:15:18 crc kubenswrapper[4789]: I1216 07:15:18.399510 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-647df7b8c5-4g6jl"] Dec 16 07:15:19 crc kubenswrapper[4789]: I1216 07:15:19.336202 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerStarted","Data":"c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6"} Dec 16 07:15:19 crc kubenswrapper[4789]: I1216 07:15:19.337014 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerStarted","Data":"02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0"} Dec 16 07:15:20 crc kubenswrapper[4789]: I1216 07:15:20.116202 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" path="/var/lib/kubelet/pods/d2f16298-806a-4dfb-a320-96f52dfeeb6e/volumes" Dec 16 07:15:21 crc kubenswrapper[4789]: I1216 07:15:21.356517 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerStarted","Data":"231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331"} Dec 16 07:15:21 crc 
kubenswrapper[4789]: I1216 07:15:21.358082 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 07:15:21 crc kubenswrapper[4789]: I1216 07:15:21.382071 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.044843242 podStartE2EDuration="6.382047915s" podCreationTimestamp="2025-12-16 07:15:15 +0000 UTC" firstStartedPulling="2025-12-16 07:15:16.203773353 +0000 UTC m=+1454.465660992" lastFinishedPulling="2025-12-16 07:15:20.540978036 +0000 UTC m=+1458.802865665" observedRunningTime="2025-12-16 07:15:21.377540456 +0000 UTC m=+1459.639428095" watchObservedRunningTime="2025-12-16 07:15:21.382047915 +0000 UTC m=+1459.643935544" Dec 16 07:15:21 crc kubenswrapper[4789]: I1216 07:15:21.927597 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:15:21 crc kubenswrapper[4789]: I1216 07:15:21.927650 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:15:24 crc kubenswrapper[4789]: I1216 07:15:24.382725 4789 generic.go:334] "Generic (PLEG): container finished" podID="c0a1b02c-34b2-4955-800f-e21970da98d9" containerID="94400587bfb136f7e982ac2bdabf638fd15508e132e3af50147661d7ccefa7f1" exitCode=0 Dec 16 07:15:24 crc kubenswrapper[4789]: I1216 07:15:24.382857 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlrbk" 
event={"ID":"c0a1b02c-34b2-4955-800f-e21970da98d9","Type":"ContainerDied","Data":"94400587bfb136f7e982ac2bdabf638fd15508e132e3af50147661d7ccefa7f1"} Dec 16 07:15:24 crc kubenswrapper[4789]: I1216 07:15:24.719047 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:15:24 crc kubenswrapper[4789]: I1216 07:15:24.719363 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:15:25 crc kubenswrapper[4789]: E1216 07:15:25.453443 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7e593c_0688_4ff7_b959_37f36a74aa2b.slice/crio-be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5\": RecentStats: unable to find data in memory cache]" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.735269 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.735549 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.780731 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.845209 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-scripts\") pod \"c0a1b02c-34b2-4955-800f-e21970da98d9\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.845275 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-config-data\") pod \"c0a1b02c-34b2-4955-800f-e21970da98d9\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.845517 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-combined-ca-bundle\") pod \"c0a1b02c-34b2-4955-800f-e21970da98d9\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.845585 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr66c\" (UniqueName: \"kubernetes.io/projected/c0a1b02c-34b2-4955-800f-e21970da98d9-kube-api-access-kr66c\") pod \"c0a1b02c-34b2-4955-800f-e21970da98d9\" (UID: \"c0a1b02c-34b2-4955-800f-e21970da98d9\") " Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.852106 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-scripts" (OuterVolumeSpecName: "scripts") pod "c0a1b02c-34b2-4955-800f-e21970da98d9" (UID: "c0a1b02c-34b2-4955-800f-e21970da98d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.852581 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a1b02c-34b2-4955-800f-e21970da98d9-kube-api-access-kr66c" (OuterVolumeSpecName: "kube-api-access-kr66c") pod "c0a1b02c-34b2-4955-800f-e21970da98d9" (UID: "c0a1b02c-34b2-4955-800f-e21970da98d9"). InnerVolumeSpecName "kube-api-access-kr66c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.884060 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0a1b02c-34b2-4955-800f-e21970da98d9" (UID: "c0a1b02c-34b2-4955-800f-e21970da98d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.893034 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-config-data" (OuterVolumeSpecName: "config-data") pod "c0a1b02c-34b2-4955-800f-e21970da98d9" (UID: "c0a1b02c-34b2-4955-800f-e21970da98d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.948081 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.948124 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.948140 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr66c\" (UniqueName: \"kubernetes.io/projected/c0a1b02c-34b2-4955-800f-e21970da98d9-kube-api-access-kr66c\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:25 crc kubenswrapper[4789]: I1216 07:15:25.948151 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a1b02c-34b2-4955-800f-e21970da98d9-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.410652 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dlrbk" event={"ID":"c0a1b02c-34b2-4955-800f-e21970da98d9","Type":"ContainerDied","Data":"9503e1f19773949572d7f35da9938995adcbb2aa40bb94bb59a8cf4dc2de699c"} Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.411247 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9503e1f19773949572d7f35da9938995adcbb2aa40bb94bb59a8cf4dc2de699c" Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.410741 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dlrbk" Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.599131 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.599396 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="00cf1f8f-f55c-445b-9338-b09efa1be344" containerName="nova-scheduler-scheduler" containerID="cri-o://3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee" gracePeriod=30 Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.618614 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.618981 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-log" containerID="cri-o://49c5368cc9ee172d9613be1998a8fcea541b7c1d603b5d95de3280ea5a8a2977" gracePeriod=30 Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.619637 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-api" containerID="cri-o://df2a294286c1b8afd2a5ef7eccb689de271e71f689ca17119dae658fad791112" gracePeriod=30 Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.629359 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.629780 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-log" containerID="cri-o://1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c" gracePeriod=30 Dec 16 07:15:26 crc kubenswrapper[4789]: I1216 07:15:26.629843 4789 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-metadata" containerID="cri-o://a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5" gracePeriod=30 Dec 16 07:15:27 crc kubenswrapper[4789]: I1216 07:15:27.419999 4789 generic.go:334] "Generic (PLEG): container finished" podID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerID="1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c" exitCode=143 Dec 16 07:15:27 crc kubenswrapper[4789]: I1216 07:15:27.420055 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01cde849-67ae-4c7f-b288-04aa02b02fc9","Type":"ContainerDied","Data":"1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c"} Dec 16 07:15:27 crc kubenswrapper[4789]: I1216 07:15:27.422555 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0779534-cd98-458a-83d8-df38579eb250" containerID="49c5368cc9ee172d9613be1998a8fcea541b7c1d603b5d95de3280ea5a8a2977" exitCode=143 Dec 16 07:15:27 crc kubenswrapper[4789]: I1216 07:15:27.422581 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0779534-cd98-458a-83d8-df38579eb250","Type":"ContainerDied","Data":"49c5368cc9ee172d9613be1998a8fcea541b7c1d603b5d95de3280ea5a8a2977"} Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.108767 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.210964 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-config-data\") pod \"00cf1f8f-f55c-445b-9338-b09efa1be344\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.211043 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-combined-ca-bundle\") pod \"00cf1f8f-f55c-445b-9338-b09efa1be344\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.211141 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4kdh\" (UniqueName: \"kubernetes.io/projected/00cf1f8f-f55c-445b-9338-b09efa1be344-kube-api-access-z4kdh\") pod \"00cf1f8f-f55c-445b-9338-b09efa1be344\" (UID: \"00cf1f8f-f55c-445b-9338-b09efa1be344\") " Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.220338 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00cf1f8f-f55c-445b-9338-b09efa1be344-kube-api-access-z4kdh" (OuterVolumeSpecName: "kube-api-access-z4kdh") pod "00cf1f8f-f55c-445b-9338-b09efa1be344" (UID: "00cf1f8f-f55c-445b-9338-b09efa1be344"). InnerVolumeSpecName "kube-api-access-z4kdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.240394 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00cf1f8f-f55c-445b-9338-b09efa1be344" (UID: "00cf1f8f-f55c-445b-9338-b09efa1be344"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.250942 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-config-data" (OuterVolumeSpecName: "config-data") pod "00cf1f8f-f55c-445b-9338-b09efa1be344" (UID: "00cf1f8f-f55c-445b-9338-b09efa1be344"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.313634 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.313669 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cf1f8f-f55c-445b-9338-b09efa1be344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.313679 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4kdh\" (UniqueName: \"kubernetes.io/projected/00cf1f8f-f55c-445b-9338-b09efa1be344-kube-api-access-z4kdh\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.440994 4789 generic.go:334] "Generic (PLEG): container finished" podID="00cf1f8f-f55c-445b-9338-b09efa1be344" containerID="3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee" exitCode=0 Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.441052 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.441376 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"00cf1f8f-f55c-445b-9338-b09efa1be344","Type":"ContainerDied","Data":"3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee"} Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.441511 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"00cf1f8f-f55c-445b-9338-b09efa1be344","Type":"ContainerDied","Data":"54e678e4bf9b70b66c57acc736c084738a526394b2b941fb10a04b2874f10834"} Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.441577 4789 scope.go:117] "RemoveContainer" containerID="3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.473966 4789 scope.go:117] "RemoveContainer" containerID="3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee" Dec 16 07:15:29 crc kubenswrapper[4789]: E1216 07:15:29.474855 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee\": container with ID starting with 3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee not found: ID does not exist" containerID="3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.474901 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee"} err="failed to get container status \"3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee\": rpc error: code = NotFound desc = could not find container \"3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee\": container with ID starting with 
3c30c5ec595d1e55b2ffc41627294c44760ee3af3f7cd025c33616ee4992fbee not found: ID does not exist" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.480372 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.496398 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.509134 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:15:29 crc kubenswrapper[4789]: E1216 07:15:29.509662 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a1b02c-34b2-4955-800f-e21970da98d9" containerName="nova-manage" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.509686 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a1b02c-34b2-4955-800f-e21970da98d9" containerName="nova-manage" Dec 16 07:15:29 crc kubenswrapper[4789]: E1216 07:15:29.509711 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerName="init" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.509721 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerName="init" Dec 16 07:15:29 crc kubenswrapper[4789]: E1216 07:15:29.509744 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00cf1f8f-f55c-445b-9338-b09efa1be344" containerName="nova-scheduler-scheduler" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.509753 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="00cf1f8f-f55c-445b-9338-b09efa1be344" containerName="nova-scheduler-scheduler" Dec 16 07:15:29 crc kubenswrapper[4789]: E1216 07:15:29.509770 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerName="dnsmasq-dns" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.509778 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerName="dnsmasq-dns" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.510003 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a1b02c-34b2-4955-800f-e21970da98d9" containerName="nova-manage" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.510045 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="00cf1f8f-f55c-445b-9338-b09efa1be344" containerName="nova-scheduler-scheduler" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.510057 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f16298-806a-4dfb-a320-96f52dfeeb6e" containerName="dnsmasq-dns" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.511136 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.513432 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.517990 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.622552 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp5w5\" (UniqueName: \"kubernetes.io/projected/8680ae27-3e72-416b-9983-9b195fedcefb-kube-api-access-tp5w5\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.622882 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-config-data\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" 
Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.623030 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.724720 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp5w5\" (UniqueName: \"kubernetes.io/projected/8680ae27-3e72-416b-9983-9b195fedcefb-kube-api-access-tp5w5\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.724810 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-config-data\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.724868 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.730462 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.732504 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-config-data\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.741562 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp5w5\" (UniqueName: \"kubernetes.io/projected/8680ae27-3e72-416b-9983-9b195fedcefb-kube-api-access-tp5w5\") pod \"nova-scheduler-0\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " pod="openstack/nova-scheduler-0" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.760683 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:48364->10.217.0.190:8775: read: connection reset by peer" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.760702 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:48352->10.217.0.190:8775: read: connection reset by peer" Dec 16 07:15:29 crc kubenswrapper[4789]: I1216 07:15:29.832549 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:15:30 crc kubenswrapper[4789]: I1216 07:15:30.121723 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00cf1f8f-f55c-445b-9338-b09efa1be344" path="/var/lib/kubelet/pods/00cf1f8f-f55c-445b-9338-b09efa1be344/volumes" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.255208 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.336495 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-combined-ca-bundle\") pod \"01cde849-67ae-4c7f-b288-04aa02b02fc9\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.336640 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01cde849-67ae-4c7f-b288-04aa02b02fc9-logs\") pod \"01cde849-67ae-4c7f-b288-04aa02b02fc9\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.336969 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-nova-metadata-tls-certs\") pod \"01cde849-67ae-4c7f-b288-04aa02b02fc9\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.337028 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-config-data\") pod \"01cde849-67ae-4c7f-b288-04aa02b02fc9\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.337113 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zwlx\" (UniqueName: \"kubernetes.io/projected/01cde849-67ae-4c7f-b288-04aa02b02fc9-kube-api-access-4zwlx\") pod \"01cde849-67ae-4c7f-b288-04aa02b02fc9\" (UID: \"01cde849-67ae-4c7f-b288-04aa02b02fc9\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.337403 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/01cde849-67ae-4c7f-b288-04aa02b02fc9-logs" (OuterVolumeSpecName: "logs") pod "01cde849-67ae-4c7f-b288-04aa02b02fc9" (UID: "01cde849-67ae-4c7f-b288-04aa02b02fc9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.338316 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01cde849-67ae-4c7f-b288-04aa02b02fc9-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.356313 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01cde849-67ae-4c7f-b288-04aa02b02fc9-kube-api-access-4zwlx" (OuterVolumeSpecName: "kube-api-access-4zwlx") pod "01cde849-67ae-4c7f-b288-04aa02b02fc9" (UID: "01cde849-67ae-4c7f-b288-04aa02b02fc9"). InnerVolumeSpecName "kube-api-access-4zwlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: W1216 07:15:30.373272 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8680ae27_3e72_416b_9983_9b195fedcefb.slice/crio-7572bea88ddbb6452e0666949ad98511513c23c8c2ce5bf46cd5129000010ee4 WatchSource:0}: Error finding container 7572bea88ddbb6452e0666949ad98511513c23c8c2ce5bf46cd5129000010ee4: Status 404 returned error can't find the container with id 7572bea88ddbb6452e0666949ad98511513c23c8c2ce5bf46cd5129000010ee4 Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.374473 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.376775 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-config-data" (OuterVolumeSpecName: "config-data") pod "01cde849-67ae-4c7f-b288-04aa02b02fc9" (UID: 
"01cde849-67ae-4c7f-b288-04aa02b02fc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.388191 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01cde849-67ae-4c7f-b288-04aa02b02fc9" (UID: "01cde849-67ae-4c7f-b288-04aa02b02fc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.398953 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "01cde849-67ae-4c7f-b288-04aa02b02fc9" (UID: "01cde849-67ae-4c7f-b288-04aa02b02fc9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.439717 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.439746 4789 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.439757 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cde849-67ae-4c7f-b288-04aa02b02fc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.439766 4789 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4zwlx\" (UniqueName: \"kubernetes.io/projected/01cde849-67ae-4c7f-b288-04aa02b02fc9-kube-api-access-4zwlx\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.456504 4789 generic.go:334] "Generic (PLEG): container finished" podID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerID="a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5" exitCode=0 Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.456551 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01cde849-67ae-4c7f-b288-04aa02b02fc9","Type":"ContainerDied","Data":"a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5"} Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.456580 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01cde849-67ae-4c7f-b288-04aa02b02fc9","Type":"ContainerDied","Data":"52caaac66d6d6a042fb84ff33b35aa02755add4845f493d10ec9cf3f4873f637"} Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.456597 4789 scope.go:117] "RemoveContainer" containerID="a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.456677 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.469105 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8680ae27-3e72-416b-9983-9b195fedcefb","Type":"ContainerStarted","Data":"7572bea88ddbb6452e0666949ad98511513c23c8c2ce5bf46cd5129000010ee4"} Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.494003 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.499641 4789 scope.go:117] "RemoveContainer" containerID="1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.503107 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.515439 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:15:31 crc kubenswrapper[4789]: E1216 07:15:30.515821 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-log" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.515836 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-log" Dec 16 07:15:31 crc kubenswrapper[4789]: E1216 07:15:30.515865 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-metadata" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.515872 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-metadata" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.516082 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" 
containerName="nova-metadata-log" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.516107 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" containerName="nova-metadata-metadata" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.517102 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.520360 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.521714 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.533238 4789 scope.go:117] "RemoveContainer" containerID="a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5" Dec 16 07:15:31 crc kubenswrapper[4789]: E1216 07:15:30.533687 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5\": container with ID starting with a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5 not found: ID does not exist" containerID="a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.533726 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5"} err="failed to get container status \"a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5\": rpc error: code = NotFound desc = could not find container \"a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5\": container with ID starting with a073f0e9ab9cd66db77fae70e081b35bcbeb83db6858d7c32abf056f4b9991e5 not found: ID does not exist" 
Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.533754 4789 scope.go:117] "RemoveContainer" containerID="1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c" Dec 16 07:15:31 crc kubenswrapper[4789]: E1216 07:15:30.540089 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c\": container with ID starting with 1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c not found: ID does not exist" containerID="1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.540134 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c"} err="failed to get container status \"1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c\": rpc error: code = NotFound desc = could not find container \"1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c\": container with ID starting with 1e64c611224b96a5b5f781a6c7f4501c3b0c7d49bdd5bfe4b1b5cf93fc4d9d6c not found: ID does not exist" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.540215 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.544501 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5031d0ac-42ac-4346-9403-0369a555ab4a-logs\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.545003 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-config-data\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.545085 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.545118 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b6mh\" (UniqueName: \"kubernetes.io/projected/5031d0ac-42ac-4346-9403-0369a555ab4a-kube-api-access-4b6mh\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.545204 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.647140 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-config-data\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.647215 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.647240 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b6mh\" (UniqueName: \"kubernetes.io/projected/5031d0ac-42ac-4346-9403-0369a555ab4a-kube-api-access-4b6mh\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.647996 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.648409 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5031d0ac-42ac-4346-9403-0369a555ab4a-logs\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.648951 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5031d0ac-42ac-4346-9403-0369a555ab4a-logs\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.651800 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-config-data\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.652099 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.654652 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.668527 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b6mh\" (UniqueName: \"kubernetes.io/projected/5031d0ac-42ac-4346-9403-0369a555ab4a-kube-api-access-4b6mh\") pod \"nova-metadata-0\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:30.835113 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.489519 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0779534-cd98-458a-83d8-df38579eb250" containerID="df2a294286c1b8afd2a5ef7eccb689de271e71f689ca17119dae658fad791112" exitCode=0 Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.490971 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0779534-cd98-458a-83d8-df38579eb250","Type":"ContainerDied","Data":"df2a294286c1b8afd2a5ef7eccb689de271e71f689ca17119dae658fad791112"} Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.500046 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8680ae27-3e72-416b-9983-9b195fedcefb","Type":"ContainerStarted","Data":"3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b"} Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.522752 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.522730688 podStartE2EDuration="2.522730688s" podCreationTimestamp="2025-12-16 07:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:31.520643407 +0000 UTC m=+1469.782531036" watchObservedRunningTime="2025-12-16 07:15:31.522730688 +0000 UTC m=+1469.784618317" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.653383 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.675836 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-combined-ca-bundle\") pod \"a0779534-cd98-458a-83d8-df38579eb250\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.676240 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-internal-tls-certs\") pod \"a0779534-cd98-458a-83d8-df38579eb250\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.676395 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-config-data\") pod \"a0779534-cd98-458a-83d8-df38579eb250\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.676548 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0779534-cd98-458a-83d8-df38579eb250-logs\") pod \"a0779534-cd98-458a-83d8-df38579eb250\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.676636 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-public-tls-certs\") pod \"a0779534-cd98-458a-83d8-df38579eb250\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.676722 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk2wz\" (UniqueName: 
\"kubernetes.io/projected/a0779534-cd98-458a-83d8-df38579eb250-kube-api-access-sk2wz\") pod \"a0779534-cd98-458a-83d8-df38579eb250\" (UID: \"a0779534-cd98-458a-83d8-df38579eb250\") " Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.677581 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0779534-cd98-458a-83d8-df38579eb250-logs" (OuterVolumeSpecName: "logs") pod "a0779534-cd98-458a-83d8-df38579eb250" (UID: "a0779534-cd98-458a-83d8-df38579eb250"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.678348 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0779534-cd98-458a-83d8-df38579eb250-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.684172 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0779534-cd98-458a-83d8-df38579eb250-kube-api-access-sk2wz" (OuterVolumeSpecName: "kube-api-access-sk2wz") pod "a0779534-cd98-458a-83d8-df38579eb250" (UID: "a0779534-cd98-458a-83d8-df38579eb250"). InnerVolumeSpecName "kube-api-access-sk2wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.710818 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0779534-cd98-458a-83d8-df38579eb250" (UID: "a0779534-cd98-458a-83d8-df38579eb250"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.725108 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-config-data" (OuterVolumeSpecName: "config-data") pod "a0779534-cd98-458a-83d8-df38579eb250" (UID: "a0779534-cd98-458a-83d8-df38579eb250"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.749228 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.750041 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0779534-cd98-458a-83d8-df38579eb250" (UID: "a0779534-cd98-458a-83d8-df38579eb250"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: W1216 07:15:31.756707 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5031d0ac_42ac_4346_9403_0369a555ab4a.slice/crio-12bc2b0ccc4896a733ff6dd23c362de481d00b71e764660a9d3291b2c0241000 WatchSource:0}: Error finding container 12bc2b0ccc4896a733ff6dd23c362de481d00b71e764660a9d3291b2c0241000: Status 404 returned error can't find the container with id 12bc2b0ccc4896a733ff6dd23c362de481d00b71e764660a9d3291b2c0241000 Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.780566 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.780705 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk2wz\" (UniqueName: \"kubernetes.io/projected/a0779534-cd98-458a-83d8-df38579eb250-kube-api-access-sk2wz\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.780763 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.780836 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.786271 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0779534-cd98-458a-83d8-df38579eb250" (UID: 
"a0779534-cd98-458a-83d8-df38579eb250"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:15:31 crc kubenswrapper[4789]: I1216 07:15:31.883108 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0779534-cd98-458a-83d8-df38579eb250-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.121839 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01cde849-67ae-4c7f-b288-04aa02b02fc9" path="/var/lib/kubelet/pods/01cde849-67ae-4c7f-b288-04aa02b02fc9/volumes" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.511136 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0779534-cd98-458a-83d8-df38579eb250","Type":"ContainerDied","Data":"334eca73992cf1f957dbbb1d934a2388845abccf75b899414c0d554124b28ff1"} Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.511473 4789 scope.go:117] "RemoveContainer" containerID="df2a294286c1b8afd2a5ef7eccb689de271e71f689ca17119dae658fad791112" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.511410 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.514496 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5031d0ac-42ac-4346-9403-0369a555ab4a","Type":"ContainerStarted","Data":"eea19a9862fbb3d803e89b2d7c5b8ef5c9fd9bd0d293359d33b0d30a3cecccac"} Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.514527 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5031d0ac-42ac-4346-9403-0369a555ab4a","Type":"ContainerStarted","Data":"c70725fd021c9c4c1eba5b71db3f401cd478eb9326f0510427dc30f5843bb19c"} Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.514541 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5031d0ac-42ac-4346-9403-0369a555ab4a","Type":"ContainerStarted","Data":"12bc2b0ccc4896a733ff6dd23c362de481d00b71e764660a9d3291b2c0241000"} Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.540020 4789 scope.go:117] "RemoveContainer" containerID="49c5368cc9ee172d9613be1998a8fcea541b7c1d603b5d95de3280ea5a8a2977" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.545752 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.545733973 podStartE2EDuration="2.545733973s" podCreationTimestamp="2025-12-16 07:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:32.539351207 +0000 UTC m=+1470.801238846" watchObservedRunningTime="2025-12-16 07:15:32.545733973 +0000 UTC m=+1470.807621602" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.573867 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.588337 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 
07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.599946 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:32 crc kubenswrapper[4789]: E1216 07:15:32.600540 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-api" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.600577 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-api" Dec 16 07:15:32 crc kubenswrapper[4789]: E1216 07:15:32.600600 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-log" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.600608 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-log" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.600849 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-api" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.600884 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0779534-cd98-458a-83d8-df38579eb250" containerName="nova-api-log" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.609603 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.612757 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.613841 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.614081 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.614299 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.712152 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-config-data\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.712239 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-logs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.712305 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.712452 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.712489 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9stf\" (UniqueName: \"kubernetes.io/projected/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-kube-api-access-c9stf\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.712522 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.814252 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-logs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.814645 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.815006 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 
07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.815621 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9stf\" (UniqueName: \"kubernetes.io/projected/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-kube-api-access-c9stf\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.815832 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.815433 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-logs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.816522 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-config-data\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.821187 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.821361 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.824595 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.825277 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-config-data\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.839579 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9stf\" (UniqueName: \"kubernetes.io/projected/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-kube-api-access-c9stf\") pod \"nova-api-0\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " pod="openstack/nova-api-0" Dec 16 07:15:32 crc kubenswrapper[4789]: I1216 07:15:32.929325 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:15:33 crc kubenswrapper[4789]: I1216 07:15:33.356500 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:15:33 crc kubenswrapper[4789]: I1216 07:15:33.526432 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84","Type":"ContainerStarted","Data":"4d63dd7640d74fbc26ecb3092e3b345de818b9a1c89962de76be7288485fe546"} Dec 16 07:15:33 crc kubenswrapper[4789]: I1216 07:15:33.526981 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84","Type":"ContainerStarted","Data":"6a2254f0c2484dc8347faedb131ca81c1258a021d3fe314d12a7dd12d706b108"} Dec 16 07:15:34 crc kubenswrapper[4789]: I1216 07:15:34.117554 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0779534-cd98-458a-83d8-df38579eb250" path="/var/lib/kubelet/pods/a0779534-cd98-458a-83d8-df38579eb250/volumes" Dec 16 07:15:34 crc kubenswrapper[4789]: I1216 07:15:34.540759 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84","Type":"ContainerStarted","Data":"17038ebe8bb333b1b73372f442dc14a740d2ad41822921ccc7adc857ef4a9c8b"} Dec 16 07:15:34 crc kubenswrapper[4789]: I1216 07:15:34.572720 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.572698949 podStartE2EDuration="2.572698949s" podCreationTimestamp="2025-12-16 07:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:15:34.560543213 +0000 UTC m=+1472.822430862" watchObservedRunningTime="2025-12-16 07:15:34.572698949 +0000 UTC m=+1472.834586578" Dec 16 07:15:34 crc kubenswrapper[4789]: I1216 07:15:34.833746 4789 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 07:15:35 crc kubenswrapper[4789]: E1216 07:15:35.675284 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7e593c_0688_4ff7_b959_37f36a74aa2b.slice/crio-be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5\": RecentStats: unable to find data in memory cache]" Dec 16 07:15:35 crc kubenswrapper[4789]: I1216 07:15:35.835253 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:15:35 crc kubenswrapper[4789]: I1216 07:15:35.835404 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 07:15:39 crc kubenswrapper[4789]: I1216 07:15:39.833289 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 07:15:39 crc kubenswrapper[4789]: I1216 07:15:39.860317 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 07:15:40 crc kubenswrapper[4789]: I1216 07:15:40.622866 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 07:15:40 crc kubenswrapper[4789]: I1216 07:15:40.836395 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:15:40 crc kubenswrapper[4789]: I1216 07:15:40.836473 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 07:15:41 crc kubenswrapper[4789]: I1216 07:15:41.849144 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Dec 16 07:15:41 crc kubenswrapper[4789]: I1216 07:15:41.849145 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:15:42 crc kubenswrapper[4789]: I1216 07:15:42.929547 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:15:42 crc kubenswrapper[4789]: I1216 07:15:42.930032 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 07:15:43 crc kubenswrapper[4789]: I1216 07:15:43.947385 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:15:43 crc kubenswrapper[4789]: I1216 07:15:43.947410 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 07:15:45 crc kubenswrapper[4789]: I1216 07:15:45.697970 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 07:15:45 crc kubenswrapper[4789]: E1216 07:15:45.910982 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7e593c_0688_4ff7_b959_37f36a74aa2b.slice/crio-be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5\": RecentStats: 
unable to find data in memory cache]" Dec 16 07:15:50 crc kubenswrapper[4789]: I1216 07:15:50.840179 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 07:15:50 crc kubenswrapper[4789]: I1216 07:15:50.840861 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 07:15:50 crc kubenswrapper[4789]: I1216 07:15:50.845236 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 07:15:51 crc kubenswrapper[4789]: I1216 07:15:51.699252 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 07:15:51 crc kubenswrapper[4789]: I1216 07:15:51.927978 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:15:51 crc kubenswrapper[4789]: I1216 07:15:51.928111 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:15:51 crc kubenswrapper[4789]: I1216 07:15:51.928218 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:15:51 crc kubenswrapper[4789]: I1216 07:15:51.929606 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8393873a978af7e8e2aad1167caa21ec29d5fd3e46fb65f45bf1708f741ab20"} 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:15:51 crc kubenswrapper[4789]: I1216 07:15:51.929706 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://c8393873a978af7e8e2aad1167caa21ec29d5fd3e46fb65f45bf1708f741ab20" gracePeriod=600 Dec 16 07:15:52 crc kubenswrapper[4789]: I1216 07:15:52.706392 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="c8393873a978af7e8e2aad1167caa21ec29d5fd3e46fb65f45bf1708f741ab20" exitCode=0 Dec 16 07:15:52 crc kubenswrapper[4789]: I1216 07:15:52.706474 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"c8393873a978af7e8e2aad1167caa21ec29d5fd3e46fb65f45bf1708f741ab20"} Dec 16 07:15:52 crc kubenswrapper[4789]: I1216 07:15:52.707262 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6"} Dec 16 07:15:52 crc kubenswrapper[4789]: I1216 07:15:52.707286 4789 scope.go:117] "RemoveContainer" containerID="a9c7a67d0b05df89259805e04a44c28da359f8954db5a37cfc842fbdb4aa2e7a" Dec 16 07:15:52 crc kubenswrapper[4789]: I1216 07:15:52.939820 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:15:52 crc kubenswrapper[4789]: I1216 07:15:52.940324 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:15:52 
crc kubenswrapper[4789]: I1216 07:15:52.945211 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 07:15:52 crc kubenswrapper[4789]: I1216 07:15:52.946127 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:15:53 crc kubenswrapper[4789]: I1216 07:15:53.719186 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 07:15:53 crc kubenswrapper[4789]: I1216 07:15:53.724669 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 07:15:56 crc kubenswrapper[4789]: E1216 07:15:56.155254 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e7e593c_0688_4ff7_b959_37f36a74aa2b.slice/crio-be1f8946da40081f7f8b76e783bb468455f761ba9b8a0ec4ef496fda703284f5\": RecentStats: unable to find data in memory cache]" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.492064 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.492822 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" containerName="openstackclient" containerID="cri-o://f8d1a6dd5068981a14ce87c5e3a5413bba046f93f1f41541bf47903d89cd9291" gracePeriod=2 Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.515659 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.596420 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6ccb68857-5qpdn"] Dec 16 07:16:11 crc kubenswrapper[4789]: E1216 07:16:11.597310 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" containerName="openstackclient" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.597416 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" containerName="openstackclient" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.597742 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" containerName="openstackclient" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.599037 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.629709 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5978f7f754-pzhh6"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.631411 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.673500 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ccb68857-5qpdn"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674494 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-combined-ca-bundle\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674527 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " 
pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674564 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f668c2-651f-48f2-8feb-7faa470c3a19-logs\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674591 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674631 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2mp2\" (UniqueName: \"kubernetes.io/projected/24f668c2-651f-48f2-8feb-7faa470c3a19-kube-api-access-w2mp2\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674649 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqflz\" (UniqueName: \"kubernetes.io/projected/c5bd2649-9508-49bb-833e-7239b7d11d78-kube-api-access-nqflz\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674672 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data-custom\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674730 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-combined-ca-bundle\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674786 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data-custom\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.674815 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5bd2649-9508-49bb-833e-7239b7d11d78-logs\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.720992 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5978f7f754-pzhh6"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780067 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-combined-ca-bundle\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: 
\"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780135 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780178 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f668c2-651f-48f2-8feb-7faa470c3a19-logs\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780199 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780233 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2mp2\" (UniqueName: \"kubernetes.io/projected/24f668c2-651f-48f2-8feb-7faa470c3a19-kube-api-access-w2mp2\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780263 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqflz\" (UniqueName: \"kubernetes.io/projected/c5bd2649-9508-49bb-833e-7239b7d11d78-kube-api-access-nqflz\") pod 
\"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780289 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data-custom\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780336 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-combined-ca-bundle\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780383 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data-custom\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780408 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5bd2649-9508-49bb-833e-7239b7d11d78-logs\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.780837 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5bd2649-9508-49bb-833e-7239b7d11d78-logs\") pod 
\"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.784949 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.792016 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f668c2-651f-48f2-8feb-7faa470c3a19-logs\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.792021 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.795717 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-combined-ca-bundle\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.803714 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-combined-ca-bundle\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.821066 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.822446 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data-custom\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.826922 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data-custom\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.833491 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqflz\" (UniqueName: \"kubernetes.io/projected/c5bd2649-9508-49bb-833e-7239b7d11d78-kube-api-access-nqflz\") pod \"barbican-worker-6ccb68857-5qpdn\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.887271 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder9a65-account-delete-cbfmk"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.888707 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.921568 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2mp2\" (UniqueName: \"kubernetes.io/projected/24f668c2-651f-48f2-8feb-7faa470c3a19-kube-api-access-w2mp2\") pod \"barbican-keystone-listener-5978f7f754-pzhh6\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.925429 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.926377 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder9a65-account-delete-cbfmk"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.948951 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.958221 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.958782 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="openstack-network-exporter" containerID="cri-o://89655488a9eed182dbfe9d3dcd61937c370e55101b1d14bd72ae2d0e119b37e0" gracePeriod=300 Dec 16 07:16:11 crc kubenswrapper[4789]: E1216 07:16:11.985540 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:16:11 crc kubenswrapper[4789]: E1216 07:16:11.985596 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data podName:31336d9f-38cf-4805-927b-3ae986f6c88e 
nodeName:}" failed. No retries permitted until 2025-12-16 07:16:12.485581268 +0000 UTC m=+1510.747468897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data") pod "rabbitmq-server-0" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e") : configmap "rabbitmq-config-data" not found Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.986406 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b855dbb8b-d8wxq"] Dec 16 07:16:11 crc kubenswrapper[4789]: I1216 07:16:11.999452 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.019029 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b855dbb8b-d8wxq"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.061955 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-glqgh"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.081435 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-glqgh"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.086355 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/bf1af2cc-24b9-4786-befa-74623fca05f7-kube-api-access-bfw6l\") pod \"cinder9a65-account-delete-cbfmk\" (UID: \"bf1af2cc-24b9-4786-befa-74623fca05f7\") " pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.086431 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1af2cc-24b9-4786-befa-74623fca05f7-operator-scripts\") pod \"cinder9a65-account-delete-cbfmk\" (UID: 
\"bf1af2cc-24b9-4786-befa-74623fca05f7\") " pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.190106 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="ovsdbserver-nb" containerID="cri-o://bb2b0b90bf159eee0923971b8cd93d6fe6ac8b4ae7096de2af216bf8667a77a6" gracePeriod=300 Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.211619 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.211713 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data-custom\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.211777 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-internal-tls-certs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.211854 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc7s9\" (UniqueName: \"kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9\") pod 
\"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.211991 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-combined-ca-bundle\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.212041 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8a84b3-ac5c-4ac8-a302-591548a970dd-logs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.212064 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-public-tls-certs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.212123 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/bf1af2cc-24b9-4786-befa-74623fca05f7-kube-api-access-bfw6l\") pod \"cinder9a65-account-delete-cbfmk\" (UID: \"bf1af2cc-24b9-4786-befa-74623fca05f7\") " pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.212196 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bf1af2cc-24b9-4786-befa-74623fca05f7-operator-scripts\") pod \"cinder9a65-account-delete-cbfmk\" (UID: \"bf1af2cc-24b9-4786-befa-74623fca05f7\") " pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.229028 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1af2cc-24b9-4786-befa-74623fca05f7-operator-scripts\") pod \"cinder9a65-account-delete-cbfmk\" (UID: \"bf1af2cc-24b9-4786-befa-74623fca05f7\") " pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.262660 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/bf1af2cc-24b9-4786-befa-74623fca05f7-kube-api-access-bfw6l\") pod \"cinder9a65-account-delete-cbfmk\" (UID: \"bf1af2cc-24b9-4786-befa-74623fca05f7\") " pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.289022 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.314816 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.314884 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data-custom\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.314939 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-internal-tls-certs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.314985 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7s9\" (UniqueName: \"kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.315045 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-combined-ca-bundle\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") 
" pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.315120 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8a84b3-ac5c-4ac8-a302-591548a970dd-logs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.315144 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-public-tls-certs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.321028 4789 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.321102 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. No retries permitted until 2025-12-16 07:16:12.82108496 +0000 UTC m=+1511.082972589 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : secret "barbican-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.342461 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-public-tls-certs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.343129 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-internal-tls-certs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.343540 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-combined-ca-bundle\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.348535 4789 projected.go:194] Error preparing data for projected volume kube-api-access-nc7s9 for pod openstack/barbican-api-7b855dbb8b-d8wxq: failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.348597 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9 podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. 
No retries permitted until 2025-12-16 07:16:12.848577399 +0000 UTC m=+1511.110465028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nc7s9" (UniqueName: "kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.353135 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8a84b3-ac5c-4ac8-a302-591548a970dd-logs\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.353499 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d632824-4eaa-4698-b244-88872be244b8" path="/var/lib/kubelet/pods/5d632824-4eaa-4698-b244-88872be244b8/volumes" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.354272 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementdcc5-account-delete-bdqp9"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.354488 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data-custom\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.355559 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementdcc5-account-delete-bdqp9"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.355581 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.355900 4789 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.356158 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="openstack-network-exporter" containerID="cri-o://36135c2133321ad8536765d85522a5abfcb2970720fa91440550ac33b7490f25" gracePeriod=300 Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.365789 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance9ba2-account-delete-tgp7b"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.391549 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.433704 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shpn4\" (UniqueName: \"kubernetes.io/projected/24b89e30-7a68-4c02-8386-cc104108a8ea-kube-api-access-shpn4\") pod \"placementdcc5-account-delete-bdqp9\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.433846 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b89e30-7a68-4c02-8386-cc104108a8ea-operator-scripts\") pod \"placementdcc5-account-delete-bdqp9\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.450064 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance9ba2-account-delete-tgp7b"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.493568 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.502057 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron5298-account-delete-lmmgd"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.503713 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.512662 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron5298-account-delete-lmmgd"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.536195 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2338e7-2de7-4149-bb6a-ae978c7e096a-operator-scripts\") pod \"glance9ba2-account-delete-tgp7b\" (UID: \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.536512 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzb7\" (UniqueName: \"kubernetes.io/projected/7f2338e7-2de7-4149-bb6a-ae978c7e096a-kube-api-access-pdzb7\") pod \"glance9ba2-account-delete-tgp7b\" (UID: \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.536562 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shpn4\" (UniqueName: \"kubernetes.io/projected/24b89e30-7a68-4c02-8386-cc104108a8ea-kube-api-access-shpn4\") pod \"placementdcc5-account-delete-bdqp9\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.536604 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/24b89e30-7a68-4c02-8386-cc104108a8ea-operator-scripts\") pod \"placementdcc5-account-delete-bdqp9\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.537148 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.537212 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data podName:31336d9f-38cf-4805-927b-3ae986f6c88e nodeName:}" failed. No retries permitted until 2025-12-16 07:16:13.537194277 +0000 UTC m=+1511.799081906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data") pod "rabbitmq-server-0" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e") : configmap "rabbitmq-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.537416 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b89e30-7a68-4c02-8386-cc104108a8ea-operator-scripts\") pod \"placementdcc5-account-delete-bdqp9\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.544097 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.544369 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="ovn-northd" containerID="cri-o://81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" gracePeriod=30 Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 
07:16:12.544501 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="openstack-network-exporter" containerID="cri-o://c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985" gracePeriod=30 Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.574050 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican30bf-account-delete-mwl8m"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.575513 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.577465 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shpn4\" (UniqueName: \"kubernetes.io/projected/24b89e30-7a68-4c02-8386-cc104108a8ea-kube-api-access-shpn4\") pod \"placementdcc5-account-delete-bdqp9\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.587558 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican30bf-account-delete-mwl8m"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.596358 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="ovsdbserver-sb" containerID="cri-o://c12dcb355856793bb3ead4efde2d2c892f26c8b50ad22dd0a5a3468ce4f9c0a6" gracePeriod=300 Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.599516 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-bk864"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.608244 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-bk864"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.621441 4789 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/novaapifc07-account-delete-bxbmv"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.622838 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.638425 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jn5h\" (UniqueName: \"kubernetes.io/projected/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-kube-api-access-8jn5h\") pod \"barbican30bf-account-delete-mwl8m\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.638474 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82acf941-5ce6-4e18-bc6d-1809296622eb-operator-scripts\") pod \"neutron5298-account-delete-lmmgd\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.637986 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tblns"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.640648 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2338e7-2de7-4149-bb6a-ae978c7e096a-operator-scripts\") pod \"glance9ba2-account-delete-tgp7b\" (UID: \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.638644 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2338e7-2de7-4149-bb6a-ae978c7e096a-operator-scripts\") pod \"glance9ba2-account-delete-tgp7b\" (UID: 
\"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.640697 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzb7\" (UniqueName: \"kubernetes.io/projected/7f2338e7-2de7-4149-bb6a-ae978c7e096a-kube-api-access-pdzb7\") pod \"glance9ba2-account-delete-tgp7b\" (UID: \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.640714 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkfm2\" (UniqueName: \"kubernetes.io/projected/82acf941-5ce6-4e18-bc6d-1809296622eb-kube-api-access-gkfm2\") pod \"neutron5298-account-delete-lmmgd\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.640772 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts\") pod \"barbican30bf-account-delete-mwl8m\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.641531 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.641565 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data podName:9452e1b2-42ec-47b6-96e1-2770c9e76db2 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:13.141554816 +0000 UTC m=+1511.403442445 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data") pod "rabbitmq-cell1-server-0" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.667137 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-ghcvz"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.667523 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-ghcvz" podUID="e23503f0-7f00-4d2d-830b-fed7db6e6a08" containerName="openstack-network-exporter" containerID="cri-o://a7f4f9abcbcd0342850b2b57ff633a5bfa01b4b748c0160240e6025712e3081c" gracePeriod=30 Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.702557 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzb7\" (UniqueName: \"kubernetes.io/projected/7f2338e7-2de7-4149-bb6a-ae978c7e096a-kube-api-access-pdzb7\") pod \"glance9ba2-account-delete-tgp7b\" (UID: \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.705597 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapifc07-account-delete-bxbmv"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.736407 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.745809 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts\") pod \"barbican30bf-account-delete-mwl8m\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.745964 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts\") pod \"novaapifc07-account-delete-bxbmv\" (UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.746041 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jn5h\" (UniqueName: \"kubernetes.io/projected/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-kube-api-access-8jn5h\") pod \"barbican30bf-account-delete-mwl8m\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.746071 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82acf941-5ce6-4e18-bc6d-1809296622eb-operator-scripts\") pod \"neutron5298-account-delete-lmmgd\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.746096 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9dvd\" (UniqueName: 
\"kubernetes.io/projected/0491a70b-b044-4ec4-b179-778967cd4573-kube-api-access-q9dvd\") pod \"novaapifc07-account-delete-bxbmv\" (UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.746212 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkfm2\" (UniqueName: \"kubernetes.io/projected/82acf941-5ce6-4e18-bc6d-1809296622eb-kube-api-access-gkfm2\") pod \"neutron5298-account-delete-lmmgd\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.747451 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts\") pod \"barbican30bf-account-delete-mwl8m\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.748097 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82acf941-5ce6-4e18-bc6d-1809296622eb-operator-scripts\") pod \"neutron5298-account-delete-lmmgd\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.794589 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-cw7z9"] Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.820343 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.834108 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkfm2\" (UniqueName: \"kubernetes.io/projected/82acf941-5ce6-4e18-bc6d-1809296622eb-kube-api-access-gkfm2\") pod \"neutron5298-account-delete-lmmgd\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.877452 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jn5h\" (UniqueName: \"kubernetes.io/projected/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-kube-api-access-8jn5h\") pod \"barbican30bf-account-delete-mwl8m\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.878569 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7s9\" (UniqueName: \"kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.878658 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts\") pod \"novaapifc07-account-delete-bxbmv\" (UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.878723 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9dvd\" (UniqueName: \"kubernetes.io/projected/0491a70b-b044-4ec4-b179-778967cd4573-kube-api-access-q9dvd\") pod \"novaapifc07-account-delete-bxbmv\" 
(UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.878777 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.879003 4789 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.879071 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. No retries permitted until 2025-12-16 07:16:13.879057033 +0000 UTC m=+1512.140944652 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : secret "barbican-config-data" not found Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.879955 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts\") pod \"novaapifc07-account-delete-bxbmv\" (UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.897344 4789 projected.go:194] Error preparing data for projected volume kube-api-access-nc7s9 for pod openstack/barbican-api-7b855dbb8b-d8wxq: failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:12 crc kubenswrapper[4789]: E1216 07:16:12.897410 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9 podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. No retries permitted until 2025-12-16 07:16:13.897392829 +0000 UTC m=+1512.159280458 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nc7s9" (UniqueName: "kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.923135 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9dvd\" (UniqueName: \"kubernetes.io/projected/0491a70b-b044-4ec4-b179-778967cd4573-kube-api-access-q9dvd\") pod \"novaapifc07-account-delete-bxbmv\" (UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:12 crc kubenswrapper[4789]: I1216 07:16:12.979412 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.020306 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell04bcc-account-delete-wzgv2"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.021521 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.030560 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.074832 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell04bcc-account-delete-wzgv2"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.107513 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkvn\" (UniqueName: \"kubernetes.io/projected/fdea835b-f122-4db5-b7c1-ca180d9f3853-kube-api-access-2kkvn\") pod \"novacell04bcc-account-delete-wzgv2\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.107592 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdea835b-f122-4db5-b7c1-ca180d9f3853-operator-scripts\") pod \"novacell04bcc-account-delete-wzgv2\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.119617 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2fceb99a-9dfd-4d79-a0fd-666390de4440/ovsdbserver-sb/0.log" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.119662 4789 generic.go:334] "Generic (PLEG): container finished" podID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerID="36135c2133321ad8536765d85522a5abfcb2970720fa91440550ac33b7490f25" exitCode=2 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.119727 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2fceb99a-9dfd-4d79-a0fd-666390de4440","Type":"ContainerDied","Data":"36135c2133321ad8536765d85522a5abfcb2970720fa91440550ac33b7490f25"} Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.130844 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-ghcvz_e23503f0-7f00-4d2d-830b-fed7db6e6a08/openstack-network-exporter/0.log" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.130898 4789 generic.go:334] "Generic (PLEG): container finished" podID="e23503f0-7f00-4d2d-830b-fed7db6e6a08" containerID="a7f4f9abcbcd0342850b2b57ff633a5bfa01b4b748c0160240e6025712e3081c" exitCode=2 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.131046 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghcvz" event={"ID":"e23503f0-7f00-4d2d-830b-fed7db6e6a08","Type":"ContainerDied","Data":"a7f4f9abcbcd0342850b2b57ff633a5bfa01b4b748c0160240e6025712e3081c"} Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.150090 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.225241 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkvn\" (UniqueName: \"kubernetes.io/projected/fdea835b-f122-4db5-b7c1-ca180d9f3853-kube-api-access-2kkvn\") pod \"novacell04bcc-account-delete-wzgv2\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.225319 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdea835b-f122-4db5-b7c1-ca180d9f3853-operator-scripts\") pod \"novacell04bcc-account-delete-wzgv2\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.236941 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdea835b-f122-4db5-b7c1-ca180d9f3853-operator-scripts\") pod 
\"novacell04bcc-account-delete-wzgv2\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.237470 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.237518 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data podName:9452e1b2-42ec-47b6-96e1-2770c9e76db2 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:14.237502002 +0000 UTC m=+1512.499389631 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data") pod "rabbitmq-cell1-server-0" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.239195 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-mgsqq"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.239606 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" podUID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerName="dnsmasq-dns" containerID="cri-o://1592998b94c2a7232306fe54bf6cc98c4eff08132169b26bf5e601d8973f4fa0" gracePeriod=10 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.273209 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9wqnt"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.304441 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9wqnt"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.326491 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_6456012f-c7be-458c-a9a5-b3958ae72c2c/ovsdbserver-nb/0.log" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.326563 4789 generic.go:334] "Generic (PLEG): container finished" podID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerID="89655488a9eed182dbfe9d3dcd61937c370e55101b1d14bd72ae2d0e119b37e0" exitCode=2 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.326580 4789 generic.go:334] "Generic (PLEG): container finished" podID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerID="bb2b0b90bf159eee0923971b8cd93d6fe6ac8b4ae7096de2af216bf8667a77a6" exitCode=143 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.326691 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6456012f-c7be-458c-a9a5-b3958ae72c2c","Type":"ContainerDied","Data":"89655488a9eed182dbfe9d3dcd61937c370e55101b1d14bd72ae2d0e119b37e0"} Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.326732 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6456012f-c7be-458c-a9a5-b3958ae72c2c","Type":"ContainerDied","Data":"bb2b0b90bf159eee0923971b8cd93d6fe6ac8b4ae7096de2af216bf8667a77a6"} Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.342831 4789 generic.go:334] "Generic (PLEG): container finished" podID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerID="c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985" exitCode=2 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.342883 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"63f88379-6b15-47a6-bf24-7cf0b3edc56a","Type":"ContainerDied","Data":"c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985"} Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.375660 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkvn\" (UniqueName: 
\"kubernetes.io/projected/fdea835b-f122-4db5-b7c1-ca180d9f3853-kube-api-access-2kkvn\") pod \"novacell04bcc-account-delete-wzgv2\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.395783 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.396268 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="cinder-scheduler" containerID="cri-o://7a559de8c4fc233747aed0e14dd0fbf6aa46b087f910b2378d491cf160c0c80e" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.396483 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="probe" containerID="cri-o://1fd3bff06a6b8d682fd662de9fa43cca1d63dd71d4ca1dc0b4dd34b7b40fb7d8" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.429171 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kxd22"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.447946 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kxd22"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.507418 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9jvbz"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.524323 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9jvbz"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.536895 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dlrbk"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.559555 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-dlrbk"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.569879 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.583191 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4kpsb"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.601195 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4kpsb"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.615533 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b48fd45b4-hp2xw"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.615806 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b48fd45b4-hp2xw" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-log" containerID="cri-o://640501fd43f4fcce68155ec6cb24a721ad4ca1ea36ae7f97fe8a96f2974be91e" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.616041 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b48fd45b4-hp2xw" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-api" containerID="cri-o://f58f590eff39129dc3fd6cbf997894d78a3061978019b25eafe3b8d013aa5949" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.620617 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.620691 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data podName:31336d9f-38cf-4805-927b-3ae986f6c88e nodeName:}" failed. 
No retries permitted until 2025-12-16 07:16:15.620671902 +0000 UTC m=+1513.882559531 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data") pod "rabbitmq-server-0" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e") : configmap "rabbitmq-config-data" not found Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.649848 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.650068 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api-log" containerID="cri-o://075adba855be9f7509e9630110074278e486377f145b6b3fe7500199bbeb6d6c" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.650458 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api" containerID="cri-o://dfe47974cb64535408aeb67063f1a4814aa8aaad5cefa3463823fe0dd085e7b6" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.699969 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.700541 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-server" containerID="cri-o://7dd74cf2b547abd9c20fc6d29daa7d954817be3444474dc3629c37701cc99230" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.700900 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-updater" 
containerID="cri-o://edcd02c79a5409469199dea08015de9c6eeffbea5566bd3cd4db97a260e47fdd" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701041 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-auditor" containerID="cri-o://ec371978a44bd2c62cd3ea38c393bf36090b055edd6151b95aa9b353fbdb7387" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701046 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="swift-recon-cron" containerID="cri-o://abff080aef14c07b0b737efd0a65faff826c48715b5f1c2ab9b91640d17f6623" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701142 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-replicator" containerID="cri-o://e0c8a6f56c8022db43b02bf2bd015331c0cdd2235c3eca42b9e1e1f7f8bd3705" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701187 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-server" containerID="cri-o://7428e5236584f2fe103930cb1f61dd303456f8c0deb11b5bbb601d51deecfb66" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701227 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-reaper" containerID="cri-o://20593004d226e1585979c62630548d692855df2932aab4c7c86476377d9cc2cc" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701272 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-auditor" containerID="cri-o://b93f37726e0744613bc7b449e38506e91bd311f3c6efe8bbf38923fdf51b2146" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701313 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-replicator" containerID="cri-o://391f051ceefce6af95f3e5e5fc2ba9a787ede01ec802f107f998941a77f4283e" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701378 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="rsync" containerID="cri-o://d8238af7dbf15f23415f0c86259fcf9957fbc0b08bcb581d4f0624333c152ec1" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701488 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-expirer" containerID="cri-o://58f8b4cb7ddbfc39c3c2c236d8c52319b46445fa6bd8e36d14a249780702ad85" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701602 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-replicator" containerID="cri-o://dfd251c4b8cc4551da74250c7e1018cc05d1c34c1749b00d7314e5704a70d11c" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701610 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-auditor" containerID="cri-o://bf3fe2408d858c60b990dfb63b6c210d31747a7a36a94cb83c07d547d090370f" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: 
I1216 07:16:13.701626 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-updater" containerID="cri-o://7f58e5c14558f31f6600906b48eb2e6f74d0e6249665f123eef015ba515b9e8b" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.701643 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-server" containerID="cri-o://c05e3cfb0b0446d45c6b1efc03786be1905a9914fbbf8eca279bc89ee3642716" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.743605 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5787d477bc-ccrwj"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.743872 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5787d477bc-ccrwj" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-api" containerID="cri-o://1811fc6d133a6d47f93c7b8e7704ffe66b0cb1ade5e47088042f32756e1e0944" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.744322 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5787d477bc-ccrwj" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-httpd" containerID="cri-o://d61236e0a1b169ed76d6b190800ead5dd0b19f9acc8d953c9f3b75b5c79591fd" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.780929 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zk74x"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.838484 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.838744 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-log" containerID="cri-o://1ca313e4e286bdf363a60db146512417c224d3addedf2f21605dd96befee2ec7" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.839729 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-httpd" containerID="cri-o://82b67d2f0b7d827d390a1737f28832c66819bfb58a92aae85e467319754e80a4" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.877333 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-zk74x"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.924242 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.943567 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.943704 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7s9\" (UniqueName: \"kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.944375 4789 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.944422 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. No retries permitted until 2025-12-16 07:16:15.944407088 +0000 UTC m=+1514.206294717 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : secret "barbican-config-data" not found Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.948240 4789 projected.go:194] Error preparing data for projected volume kube-api-access-nc7s9 for pod openstack/barbican-api-7b855dbb8b-d8wxq: failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:13 crc kubenswrapper[4789]: E1216 07:16:13.948299 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9 podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. No retries permitted until 2025-12-16 07:16:15.948282852 +0000 UTC m=+1514.210170471 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nc7s9" (UniqueName: "kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.976286 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5978f7f754-pzhh6"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.992991 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.993335 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-log" containerID="cri-o://e13e48820a043301c899b786263392f63ecec23d16cafd76439ce501fb5d2638" gracePeriod=30 Dec 16 07:16:13 crc kubenswrapper[4789]: I1216 07:16:13.993867 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-httpd" containerID="cri-o://79b3fcff6d02b1d1105cdaa7d49563ca416afc3d0d209ae94eae9fd336eca759" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.018541 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.022846 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerName="rabbitmq" containerID="cri-o://238b569af7959004c01bd0394274b3ef8d6991fd0c3fdae6cc211fa624cb5354" gracePeriod=604800 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.055139 4789 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.055444 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-log" containerID="cri-o://c70725fd021c9c4c1eba5b71db3f401cd478eb9326f0510427dc30f5843bb19c" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.055621 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-metadata" containerID="cri-o://eea19a9862fbb3d803e89b2d7c5b8ef5c9fd9bd0d293359d33b0d30a3cecccac" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.104460 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.104737 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-log" containerID="cri-o://4d63dd7640d74fbc26ecb3092e3b345de818b9a1c89962de76be7288485fe546" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.108041 4789 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/swift-proxy-7f7b9cd85-4tf54" secret="" err="secret \"swift-swift-dockercfg-whtm2\" not found"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.105332 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-api" containerID="cri-o://17038ebe8bb333b1b73372f442dc14a740d2ad41822921ccc7adc857ef4a9c8b" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.156298 4789 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.156333 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.158619 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-7f7b9cd85-4tf54: [secret "swift-conf" not found, configmap "swift-ring-files" not found]
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.158680 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift podName:fc02bf7e-2d67-40a4-94b0-5807631a5b2e nodeName:}" failed. No retries permitted until 2025-12-16 07:16:14.65866378 +0000 UTC m=+1512.920551409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift") pod "swift-proxy-7f7b9cd85-4tf54" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e") : [secret "swift-conf" not found, configmap "swift-ring-files" not found]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.166131 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ebd2c4-dad5-403a-aa60-77241f62af72" path="/var/lib/kubelet/pods/11ebd2c4-dad5-403a-aa60-77241f62af72/volumes"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.167058 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f721de0-e915-40f9-9444-f3135f39072c" path="/var/lib/kubelet/pods/3f721de0-e915-40f9-9444-f3135f39072c/volumes"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.167556 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4284636e-4d98-4efc-a75a-18eada4a3a8d" path="/var/lib/kubelet/pods/4284636e-4d98-4efc-a75a-18eada4a3a8d/volumes"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.187808 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6" path="/var/lib/kubelet/pods/5c0faa76-d0ec-471c-bf3a-dfb991f4dfc6/volumes"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.188729 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a1b402-4791-402a-aa3e-b7f400007ac2" path="/var/lib/kubelet/pods/a0a1b402-4791-402a-aa3e-b7f400007ac2/volumes"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.189250 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a1b02c-34b2-4955-800f-e21970da98d9" path="/var/lib/kubelet/pods/c0a1b02c-34b2-4955-800f-e21970da98d9/volumes"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.190224 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f73fa10b-54a6-4292-be91-84657f4a43cd" path="/var/lib/kubelet/pods/f73fa10b-54a6-4292-be91-84657f4a43cd/volumes"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.190739 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jfvcn"]
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.203879 4789 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Dec 16 07:16:14 crc kubenswrapper[4789]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 16 07:16:14 crc kubenswrapper[4789]: + source /usr/local/bin/container-scripts/functions
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNBridge=br-int
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNRemote=tcp:localhost:6642
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNEncapType=geneve
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNAvailabilityZones=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ EnableChassisAsGateway=true
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ PhysicalNetworks=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNHostName=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ ovs_dir=/var/lib/openvswitch
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + sleep 0.5
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + sleep 0.5
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + cleanup_ovsdb_server_semaphore
Dec 16 07:16:14 crc kubenswrapper[4789]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 16 07:16:14 crc kubenswrapper[4789]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 16 07:16:14 crc kubenswrapper[4789]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-tblns" message=<
Dec 16 07:16:14 crc kubenswrapper[4789]: Exiting ovsdb-server (5) [ OK ]
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 16 07:16:14 crc kubenswrapper[4789]: + source /usr/local/bin/container-scripts/functions
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNBridge=br-int
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNRemote=tcp:localhost:6642
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNEncapType=geneve
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNAvailabilityZones=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ EnableChassisAsGateway=true
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ PhysicalNetworks=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNHostName=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ ovs_dir=/var/lib/openvswitch
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + sleep 0.5
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + sleep 0.5
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + cleanup_ovsdb_server_semaphore
Dec 16 07:16:14 crc kubenswrapper[4789]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 16 07:16:14 crc kubenswrapper[4789]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 16 07:16:14 crc kubenswrapper[4789]: >
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.204016 4789 kuberuntime_container.go:691] "PreStop hook failed" err=<
Dec 16 07:16:14 crc kubenswrapper[4789]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Dec 16 07:16:14 crc kubenswrapper[4789]: + source /usr/local/bin/container-scripts/functions
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNBridge=br-int
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNRemote=tcp:localhost:6642
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNEncapType=geneve
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNAvailabilityZones=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ EnableChassisAsGateway=true
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ PhysicalNetworks=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ OVNHostName=
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ DB_FILE=/etc/openvswitch/conf.db
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ ovs_dir=/var/lib/openvswitch
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Dec 16 07:16:14 crc kubenswrapper[4789]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + sleep 0.5
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + sleep 0.5
Dec 16 07:16:14 crc kubenswrapper[4789]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Dec 16 07:16:14 crc kubenswrapper[4789]: + cleanup_ovsdb_server_semaphore
Dec 16 07:16:14 crc kubenswrapper[4789]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Dec 16 07:16:14 crc kubenswrapper[4789]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Dec 16 07:16:14 crc kubenswrapper[4789]: > pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" containerID="cri-o://b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.204051 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" containerID="cri-o://b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" gracePeriod=29
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.214599 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jfvcn"]
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.221215 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.223565 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.227009 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.227075 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="ovn-northd"
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.257589 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.258264 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data podName:9452e1b2-42ec-47b6-96e1-2770c9e76db2 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:16.258244332 +0000 UTC m=+1514.520131981 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data") pod "rabbitmq-cell1-server-0" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2") : configmap "rabbitmq-cell1-config-data" not found
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.273603 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e2f9-account-create-update-hjzt7"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.308666 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6456012f-c7be-458c-a9a5-b3958ae72c2c/ovsdbserver-nb/0.log"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.311007 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.345248 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e2f9-account-create-update-hjzt7"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.362432 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" containerID="cri-o://4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" gracePeriod=29
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.363627 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ghcvz_e23503f0-7f00-4d2d-830b-fed7db6e6a08/openstack-network-exporter/0.log"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.363726 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ghcvz"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.363729 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdbserver-nb-tls-certs\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.363826 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.363883 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-scripts\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.363950 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkcjb\" (UniqueName: \"kubernetes.io/projected/6456012f-c7be-458c-a9a5-b3958ae72c2c-kube-api-access-fkcjb\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.364453 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-combined-ca-bundle\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.364537 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdb-rundir\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.364642 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-metrics-certs-tls-certs\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.364702 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-config\") pod \"6456012f-c7be-458c-a9a5-b3958ae72c2c\" (UID: \"6456012f-c7be-458c-a9a5-b3958ae72c2c\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.364727 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-scripts" (OuterVolumeSpecName: "scripts") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.368145 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.368293 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.368772 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-config" (OuterVolumeSpecName: "config") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.391004 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bbbc99994-cwbpw"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.391467 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bbbc99994-cwbpw" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api-log" containerID="cri-o://f71a167007b51e6b5519402191320729c426676f344ae45a6739ee1603881192" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.391598 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bbbc99994-cwbpw" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api" containerID="cri-o://063de08125ee34cdd307d91af26a93c26622ecd506fc1ef247a55845563c91d4" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.392000 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.407842 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.412100 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6456012f-c7be-458c-a9a5-b3958ae72c2c-kube-api-access-fkcjb" (OuterVolumeSpecName: "kube-api-access-fkcjb") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "kube-api-access-fkcjb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.416702 4789 generic.go:334] "Generic (PLEG): container finished" podID="8368d044-b088-48f9-b5cb-19a95b997576" containerID="640501fd43f4fcce68155ec6cb24a721ad4ca1ea36ae7f97fe8a96f2974be91e" exitCode=143
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.417024 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b855dbb8b-d8wxq"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.417064 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b48fd45b4-hp2xw" event={"ID":"8368d044-b088-48f9-b5cb-19a95b997576","Type":"ContainerDied","Data":"640501fd43f4fcce68155ec6cb24a721ad4ca1ea36ae7f97fe8a96f2974be91e"}
Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.421235 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data kube-api-access-nc7s9], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/barbican-api-7b855dbb8b-d8wxq" podUID="fe8a84b3-ac5c-4ac8-a302-591548a970dd"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.428485 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.428761 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener-log" containerID="cri-o://38cb122304e1fdaa474b979a230d2c0c3d7a6825b41760a62e2b6923a3837070" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.429405 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener" containerID="cri-o://2339da9c2a0e96f20cb674ffa23240a66f38f9520dccf9e45eba3c48effd6316" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.439471 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5978f7f754-pzhh6"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.452528 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6ccb68857-5qpdn"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.454208 4789 generic.go:334] "Generic (PLEG): container finished" podID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerID="c70725fd021c9c4c1eba5b71db3f401cd478eb9326f0510427dc30f5843bb19c" exitCode=143
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.454317 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5031d0ac-42ac-4346-9403-0369a555ab4a","Type":"ContainerDied","Data":"c70725fd021c9c4c1eba5b71db3f401cd478eb9326f0510427dc30f5843bb19c"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.466607 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-864d99d789-mv5rh"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.468066 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-864d99d789-mv5rh" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker" containerID="cri-o://0f48a94f282f9b6fdc9bae2f55acd33cf0b6397237ba430e41c42e3e2660b0b4" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.471797 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-864d99d789-mv5rh" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker-log" containerID="cri-o://305b96e7f12f126be8501fb24906a6e570466b444e3bf1c2ee9a66e30d5add7f" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.473376 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23503f0-7f00-4d2d-830b-fed7db6e6a08-config\") pod \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.473524 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6cbk\" (UniqueName: \"kubernetes.io/projected/e23503f0-7f00-4d2d-830b-fed7db6e6a08-kube-api-access-r6cbk\") pod \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.473632 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-combined-ca-bundle\") pod \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.474185 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovn-rundir\") pod \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.475988 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e23503f0-7f00-4d2d-830b-fed7db6e6a08-config" (OuterVolumeSpecName: "config") pod "e23503f0-7f00-4d2d-830b-fed7db6e6a08" (UID: "e23503f0-7f00-4d2d-830b-fed7db6e6a08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.475995 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovs-rundir\") pod \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.476142 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-metrics-certs-tls-certs\") pod \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\" (UID: \"e23503f0-7f00-4d2d-830b-fed7db6e6a08\") "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.476241 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "e23503f0-7f00-4d2d-830b-fed7db6e6a08" (UID: "e23503f0-7f00-4d2d-830b-fed7db6e6a08"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.476960 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.477402 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23503f0-7f00-4d2d-830b-fed7db6e6a08-config\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.477458 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkcjb\" (UniqueName: \"kubernetes.io/projected/6456012f-c7be-458c-a9a5-b3958ae72c2c-kube-api-access-fkcjb\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.477515 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.477566 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456012f-c7be-458c-a9a5-b3958ae72c2c-config\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.501961 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "e23503f0-7f00-4d2d-830b-fed7db6e6a08" (UID: "e23503f0-7f00-4d2d-830b-fed7db6e6a08"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.504521 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerName="galera" containerID="cri-o://eddf93e3cc7bc715cd0565a58d08d71f72cabb8839a08286cf1924335b971dde" gracePeriod=30
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.505311 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2ln4f"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.561020 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.580136 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovn-rundir\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.580165 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.580176 4789 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e23503f0-7f00-4d2d-830b-fed7db6e6a08-ovs-rundir\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588728 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="d8238af7dbf15f23415f0c86259fcf9957fbc0b08bcb581d4f0624333c152ec1" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588786 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="58f8b4cb7ddbfc39c3c2c236d8c52319b46445fa6bd8e36d14a249780702ad85" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588849 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="7f58e5c14558f31f6600906b48eb2e6f74d0e6249665f123eef015ba515b9e8b" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588861 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="bf3fe2408d858c60b990dfb63b6c210d31747a7a36a94cb83c07d547d090370f" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588878 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="dfd251c4b8cc4551da74250c7e1018cc05d1c34c1749b00d7314e5704a70d11c" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588893 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="edcd02c79a5409469199dea08015de9c6eeffbea5566bd3cd4db97a260e47fdd" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588927 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="ec371978a44bd2c62cd3ea38c393bf36090b055edd6151b95aa9b353fbdb7387" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588967 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="e0c8a6f56c8022db43b02bf2bd015331c0cdd2235c3eca42b9e1e1f7f8bd3705" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.588989 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="20593004d226e1585979c62630548d692855df2932aab4c7c86476377d9cc2cc" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589008 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="b93f37726e0744613bc7b449e38506e91bd311f3c6efe8bbf38923fdf51b2146" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589018 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="391f051ceefce6af95f3e5e5fc2ba9a787ede01ec802f107f998941a77f4283e" exitCode=0
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589158 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"d8238af7dbf15f23415f0c86259fcf9957fbc0b08bcb581d4f0624333c152ec1"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589226 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"58f8b4cb7ddbfc39c3c2c236d8c52319b46445fa6bd8e36d14a249780702ad85"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589259 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"7f58e5c14558f31f6600906b48eb2e6f74d0e6249665f123eef015ba515b9e8b"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589278 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"bf3fe2408d858c60b990dfb63b6c210d31747a7a36a94cb83c07d547d090370f"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589291 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"dfd251c4b8cc4551da74250c7e1018cc05d1c34c1749b00d7314e5704a70d11c"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589314 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"edcd02c79a5409469199dea08015de9c6eeffbea5566bd3cd4db97a260e47fdd"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589337 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"ec371978a44bd2c62cd3ea38c393bf36090b055edd6151b95aa9b353fbdb7387"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"e0c8a6f56c8022db43b02bf2bd015331c0cdd2235c3eca42b9e1e1f7f8bd3705"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589371 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"20593004d226e1585979c62630548d692855df2932aab4c7c86476377d9cc2cc"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589387 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"b93f37726e0744613bc7b449e38506e91bd311f3c6efe8bbf38923fdf51b2146"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.589401 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"391f051ceefce6af95f3e5e5fc2ba9a787ede01ec802f107f998941a77f4283e"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.612586 4789 generic.go:334] "Generic (PLEG): container finished" podID="37216df1-3f61-412b-bffb-5e36812383f4" containerID="1ca313e4e286bdf363a60db146512417c224d3addedf2f21605dd96befee2ec7" exitCode=143
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.612663 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37216df1-3f61-412b-bffb-5e36812383f4","Type":"ContainerDied","Data":"1ca313e4e286bdf363a60db146512417c224d3addedf2f21605dd96befee2ec7"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.619082 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ghcvz_e23503f0-7f00-4d2d-830b-fed7db6e6a08/openstack-network-exporter/0.log"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.619216 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ghcvz" event={"ID":"e23503f0-7f00-4d2d-830b-fed7db6e6a08","Type":"ContainerDied","Data":"1932411cb589fdeeffd4e760dc58986c5e9c987b080cd34d32d9f259a3620e50"}
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.619252 4789 scope.go:117] "RemoveContainer" containerID="a7f4f9abcbcd0342850b2b57ff633a5bfa01b4b748c0160240e6025712e3081c"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.619282 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ghcvz"
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.622452 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2ln4f"]
Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.624537 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e23503f0-7f00-4d2d-830b-fed7db6e6a08" (UID: "e23503f0-7f00-4d2d-830b-fed7db6e6a08").
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.626566 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" containerID="f8d1a6dd5068981a14ce87c5e3a5413bba046f93f1f41541bf47903d89cd9291" exitCode=137 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.628861 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23503f0-7f00-4d2d-830b-fed7db6e6a08-kube-api-access-r6cbk" (OuterVolumeSpecName: "kube-api-access-r6cbk") pod "e23503f0-7f00-4d2d-830b-fed7db6e6a08" (UID: "e23503f0-7f00-4d2d-830b-fed7db6e6a08"). InnerVolumeSpecName "kube-api-access-r6cbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.630682 4789 generic.go:334] "Generic (PLEG): container finished" podID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerID="e13e48820a043301c899b786263392f63ecec23d16cafd76439ce501fb5d2638" exitCode=143 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.630707 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6423ab7-79a3-402c-9115-e54b5f29ad05","Type":"ContainerDied","Data":"e13e48820a043301c899b786263392f63ecec23d16cafd76439ce501fb5d2638"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.639223 4789 generic.go:334] "Generic (PLEG): container finished" podID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerID="4d63dd7640d74fbc26ecb3092e3b345de818b9a1c89962de76be7288485fe546" exitCode=143 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.639381 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84","Type":"ContainerDied","Data":"4d63dd7640d74fbc26ecb3092e3b345de818b9a1c89962de76be7288485fe546"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 
07:16:14.642062 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.647146 4789 generic.go:334] "Generic (PLEG): container finished" podID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerID="d61236e0a1b169ed76d6b190800ead5dd0b19f9acc8d953c9f3b75b5c79591fd" exitCode=0 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.647219 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5787d477bc-ccrwj" event={"ID":"73660d16-d925-4e43-8df7-2c40959bb7ed","Type":"ContainerDied","Data":"d61236e0a1b169ed76d6b190800ead5dd0b19f9acc8d953c9f3b75b5c79591fd"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.650814 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2fceb99a-9dfd-4d79-a0fd-666390de4440/ovsdbserver-sb/0.log" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.651094 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.658594 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.663636 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6456012f-c7be-458c-a9a5-b3958ae72c2c/ovsdbserver-nb/0.log" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.663719 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6456012f-c7be-458c-a9a5-b3958ae72c2c","Type":"ContainerDied","Data":"5187726476a931cbb4c315efec1f0ad6089e99939e1b36488eb5216377147f30"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.663904 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.670631 4789 scope.go:117] "RemoveContainer" containerID="89655488a9eed182dbfe9d3dcd61937c370e55101b1d14bd72ae2d0e119b37e0" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.688675 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.688883 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="358d8958-a563-407c-8b7f-75aee19a3699" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.689219 4789 generic.go:334] "Generic (PLEG): container finished" podID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerID="1592998b94c2a7232306fe54bf6cc98c4eff08132169b26bf5e601d8973f4fa0" exitCode=0 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.689326 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" 
event={"ID":"0ff4de9f-c7d4-4d77-81be-7a499ead0f10","Type":"ContainerDied","Data":"1592998b94c2a7232306fe54bf6cc98c4eff08132169b26bf5e601d8973f4fa0"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.691026 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.691057 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.691070 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6cbk\" (UniqueName: \"kubernetes.io/projected/e23503f0-7f00-4d2d-830b-fed7db6e6a08-kube-api-access-r6cbk\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.691083 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.691188 4789 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.691203 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.691215 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-7f7b9cd85-4tf54: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 16 07:16:14 crc kubenswrapper[4789]: E1216 07:16:14.691267 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift podName:fc02bf7e-2d67-40a4-94b0-5807631a5b2e nodeName:}" failed. No retries permitted until 2025-12-16 07:16:15.691248464 +0000 UTC m=+1513.953136093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift") pod "swift-proxy-7f7b9cd85-4tf54" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.692548 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.699554 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.699951 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="00293d36-0c18-4d79-aacd-4224045ff895" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://702d34f2d07c6bd63fc7dbcd22cbe07825238be81729f58170d0b20b26fc3bf9" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.705131 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6456012f-c7be-458c-a9a5-b3958ae72c2c" (UID: "6456012f-c7be-458c-a9a5-b3958ae72c2c"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.706311 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2fceb99a-9dfd-4d79-a0fd-666390de4440/ovsdbserver-sb/0.log" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.706353 4789 generic.go:334] "Generic (PLEG): container finished" podID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerID="c12dcb355856793bb3ead4efde2d2c892f26c8b50ad22dd0a5a3468ce4f9c0a6" exitCode=143 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.706413 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2fceb99a-9dfd-4d79-a0fd-666390de4440","Type":"ContainerDied","Data":"c12dcb355856793bb3ead4efde2d2c892f26c8b50ad22dd0a5a3468ce4f9c0a6"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.706420 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.713709 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.733334 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xtjrv"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.741249 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e23503f0-7f00-4d2d-830b-fed7db6e6a08" (UID: "e23503f0-7f00-4d2d-830b-fed7db6e6a08"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.747233 4789 generic.go:334] "Generic (PLEG): container finished" podID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerID="075adba855be9f7509e9630110074278e486377f145b6b3fe7500199bbeb6d6c" exitCode=143 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.747338 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5","Type":"ContainerDied","Data":"075adba855be9f7509e9630110074278e486377f145b6b3fe7500199bbeb6d6c"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.749455 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" event={"ID":"24f668c2-651f-48f2-8feb-7faa470c3a19","Type":"ContainerStarted","Data":"89604f15dd7e98d8241fb24132a1f1a8b21deee7462d248bc8212e3dbb60ee2c"} Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.762146 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xtjrv"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.774138 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.781237 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" containerName="nova-cell1-conductor-conductor" containerID="cri-o://219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.784180 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.794095 4789 scope.go:117] "RemoveContainer" containerID="bb2b0b90bf159eee0923971b8cd93d6fe6ac8b4ae7096de2af216bf8667a77a6" Dec 16 07:16:14 crc 
kubenswrapper[4789]: I1216 07:16:14.796771 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl6mk\" (UniqueName: \"kubernetes.io/projected/2fceb99a-9dfd-4d79-a0fd-666390de4440-kube-api-access-gl6mk\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.796959 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-config\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.797605 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-config" (OuterVolumeSpecName: "config") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.801355 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-scripts" (OuterVolumeSpecName: "scripts") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.797104 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-scripts\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805612 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdb-rundir\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805647 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdbserver-sb-tls-certs\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805673 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-sb\") pod \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805705 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbr2s\" (UniqueName: \"kubernetes.io/projected/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-kube-api-access-xbr2s\") pod \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805745 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-combined-ca-bundle\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805775 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-swift-storage-0\") pod \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805798 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc\") pod \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805848 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.805880 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-metrics-certs-tls-certs\") pod \"2fceb99a-9dfd-4d79-a0fd-666390de4440\" (UID: \"2fceb99a-9dfd-4d79-a0fd-666390de4440\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.806673 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.806691 4789 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/2fceb99a-9dfd-4d79-a0fd-666390de4440-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.806699 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6456012f-c7be-458c-a9a5-b3958ae72c2c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.806709 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e23503f0-7f00-4d2d-830b-fed7db6e6a08-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.811802 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.820348 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fceb99a-9dfd-4d79-a0fd-666390de4440-kube-api-access-gl6mk" (OuterVolumeSpecName: "kube-api-access-gl6mk") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "kube-api-access-gl6mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.835308 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-kube-api-access-xbr2s" (OuterVolumeSpecName: "kube-api-access-xbr2s") pod "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" (UID: "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978"). 
InnerVolumeSpecName "kube-api-access-xbr2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.843058 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.874510 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6ccb68857-5qpdn"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.908512 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-config\") pod \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.908606 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config-secret\") pod \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.908636 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-nb\") pod \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.908721 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78m57\" (UniqueName: 
\"kubernetes.io/projected/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-kube-api-access-78m57\") pod \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.908755 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-combined-ca-bundle\") pod \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.908787 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config\") pod \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\" (UID: \"a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978\") " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.909888 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.909934 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbr2s\" (UniqueName: \"kubernetes.io/projected/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-kube-api-access-xbr2s\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.909957 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.909967 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl6mk\" (UniqueName: \"kubernetes.io/projected/2fceb99a-9dfd-4d79-a0fd-666390de4440-kube-api-access-gl6mk\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:14 
crc kubenswrapper[4789]: I1216 07:16:14.943859 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.944081 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8680ae27-3e72-416b-9983-9b195fedcefb" containerName="nova-scheduler-scheduler" containerID="cri-o://3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b" gracePeriod=30 Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.948151 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-kube-api-access-78m57" (OuterVolumeSpecName: "kube-api-access-78m57") pod "0ff4de9f-c7d4-4d77-81be-7a499ead0f10" (UID: "0ff4de9f-c7d4-4d77-81be-7a499ead0f10"). InnerVolumeSpecName "kube-api-access-78m57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.984547 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:14 crc kubenswrapper[4789]: I1216 07:16:14.985382 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" (UID: "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.001254 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.001845 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerName="rabbitmq" containerID="cri-o://fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362" gracePeriod=604800 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.004794 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.010949 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ff4de9f-c7d4-4d77-81be-7a499ead0f10" (UID: "0ff4de9f-c7d4-4d77-81be-7a499ead0f10"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.011471 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc\") pod \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\" (UID: \"0ff4de9f-c7d4-4d77-81be-7a499ead0f10\") " Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.012116 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.012145 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.012160 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.012172 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78m57\" (UniqueName: \"kubernetes.io/projected/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-kube-api-access-78m57\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.012183 4789 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: W1216 07:16:15.012274 4789 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0ff4de9f-c7d4-4d77-81be-7a499ead0f10/volumes/kubernetes.io~configmap/dns-svc Dec 16 
07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.012287 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ff4de9f-c7d4-4d77-81be-7a499ead0f10" (UID: "0ff4de9f-c7d4-4d77-81be-7a499ead0f10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.014170 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" (UID: "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.030672 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ff4de9f-c7d4-4d77-81be-7a499ead0f10" (UID: "0ff4de9f-c7d4-4d77-81be-7a499ead0f10"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.058031 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-config" (OuterVolumeSpecName: "config") pod "0ff4de9f-c7d4-4d77-81be-7a499ead0f10" (UID: "0ff4de9f-c7d4-4d77-81be-7a499ead0f10"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.060366 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ff4de9f-c7d4-4d77-81be-7a499ead0f10" (UID: "0ff4de9f-c7d4-4d77-81be-7a499ead0f10"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.085659 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementdcc5-account-delete-bdqp9"] Dec 16 07:16:15 crc kubenswrapper[4789]: W1216 07:16:15.091114 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24b89e30_7a68_4c02_8386_cc104108a8ea.slice/crio-28328e634aa30218f149f9ef5d6f07fdb26459d3788ada2018a1f826ada9399e WatchSource:0}: Error finding container 28328e634aa30218f149f9ef5d6f07fdb26459d3788ada2018a1f826ada9399e: Status 404 returned error can't find the container with id 28328e634aa30218f149f9ef5d6f07fdb26459d3788ada2018a1f826ada9399e Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.091482 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2fceb99a-9dfd-4d79-a0fd-666390de4440" (UID: "2fceb99a-9dfd-4d79-a0fd-666390de4440"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.095540 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" (UID: "a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.114569 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.114608 4789 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.114697 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.114711 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fceb99a-9dfd-4d79-a0fd-666390de4440-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.114724 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.114735 4789 reconciler_common.go:293] "Volume detached for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.114748 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.140888 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ff4de9f-c7d4-4d77-81be-7a499ead0f10" (UID: "0ff4de9f-c7d4-4d77-81be-7a499ead0f10"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.166483 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder9a65-account-delete-cbfmk"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.216837 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ff4de9f-c7d4-4d77-81be-7a499ead0f10-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.303633 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.333786 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance9ba2-account-delete-tgp7b"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.387214 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-controller-metrics-ghcvz"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.401782 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-ghcvz"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.420152 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.428062 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.434970 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.436972 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" containerName="nova-cell1-conductor-conductor" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.444291 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.458127 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapifc07-account-delete-bxbmv"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.469084 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron5298-account-delete-lmmgd"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.482055 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.493144 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.501257 4789 scope.go:117] "RemoveContainer" containerID="36135c2133321ad8536765d85522a5abfcb2970720fa91440550ac33b7490f25" Dec 16 07:16:15 crc kubenswrapper[4789]: W1216 07:16:15.521411 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0491a70b_b044_4ec4_b179_778967cd4573.slice/crio-0930e632f5911a19af40b37bfd265349ca7f713b1462c8df6cc8e79a6d764317 WatchSource:0}: Error finding container 0930e632f5911a19af40b37bfd265349ca7f713b1462c8df6cc8e79a6d764317: Status 404 returned error can't find the container with id 0930e632f5911a19af40b37bfd265349ca7f713b1462c8df6cc8e79a6d764317 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.543956 4789 scope.go:117] "RemoveContainer" containerID="c12dcb355856793bb3ead4efde2d2c892f26c8b50ad22dd0a5a3468ce4f9c0a6" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.562476 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell04bcc-account-delete-wzgv2"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.608665 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican30bf-account-delete-mwl8m"] Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.624876 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.624967 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data 
podName:31336d9f-38cf-4805-927b-3ae986f6c88e nodeName:}" failed. No retries permitted until 2025-12-16 07:16:19.624950657 +0000 UTC m=+1517.886838286 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data") pod "rabbitmq-server-0" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e") : configmap "rabbitmq-config-data" not found Dec 16 07:16:15 crc kubenswrapper[4789]: W1216 07:16:15.656539 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1964cf41_49d7_4b0d_ab8b_fbf9b621e359.slice/crio-1f176ee0624459f5cffe99a925013e3dc2f1324da883fbea30790016109ae3fa WatchSource:0}: Error finding container 1f176ee0624459f5cffe99a925013e3dc2f1324da883fbea30790016109ae3fa: Status 404 returned error can't find the container with id 1f176ee0624459f5cffe99a925013e3dc2f1324da883fbea30790016109ae3fa Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.726677 4789 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.726714 4789 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.726723 4789 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-7f7b9cd85-4tf54: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 16 07:16:15 crc kubenswrapper[4789]: E1216 07:16:15.726779 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift podName:fc02bf7e-2d67-40a4-94b0-5807631a5b2e nodeName:}" failed. No retries permitted until 2025-12-16 07:16:17.726751753 +0000 UTC m=+1515.988639382 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift") pod "swift-proxy-7f7b9cd85-4tf54" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.841107 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f7b9cd85-4tf54"] Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.842003 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f7b9cd85-4tf54" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-httpd" containerID="cri-o://f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0" gracePeriod=30 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.842153 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f7b9cd85-4tf54" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-server" containerID="cri-o://db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac" gracePeriod=30 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.853315 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="c05e3cfb0b0446d45c6b1efc03786be1905a9914fbbf8eca279bc89ee3642716" exitCode=0 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.853343 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="7428e5236584f2fe103930cb1f61dd303456f8c0deb11b5bbb601d51deecfb66" exitCode=0 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.853351 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="7dd74cf2b547abd9c20fc6d29daa7d954817be3444474dc3629c37701cc99230" exitCode=0 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 
07:16:15.853392 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"c05e3cfb0b0446d45c6b1efc03786be1905a9914fbbf8eca279bc89ee3642716"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.853415 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"7428e5236584f2fe103930cb1f61dd303456f8c0deb11b5bbb601d51deecfb66"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.853434 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"7dd74cf2b547abd9c20fc6d29daa7d954817be3444474dc3629c37701cc99230"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.895097 4789 generic.go:334] "Generic (PLEG): container finished" podID="de637363-990a-4590-b9c5-ab66c18ec270" containerID="1fd3bff06a6b8d682fd662de9fa43cca1d63dd71d4ca1dc0b4dd34b7b40fb7d8" exitCode=0 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.895128 4789 generic.go:334] "Generic (PLEG): container finished" podID="de637363-990a-4590-b9c5-ab66c18ec270" containerID="7a559de8c4fc233747aed0e14dd0fbf6aa46b087f910b2378d491cf160c0c80e" exitCode=0 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.895180 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de637363-990a-4590-b9c5-ab66c18ec270","Type":"ContainerDied","Data":"1fd3bff06a6b8d682fd662de9fa43cca1d63dd71d4ca1dc0b4dd34b7b40fb7d8"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.895207 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de637363-990a-4590-b9c5-ab66c18ec270","Type":"ContainerDied","Data":"7a559de8c4fc233747aed0e14dd0fbf6aa46b087f910b2378d491cf160c0c80e"} Dec 16 07:16:15 crc 
kubenswrapper[4789]: I1216 07:16:15.898832 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9ba2-account-delete-tgp7b" event={"ID":"7f2338e7-2de7-4149-bb6a-ae978c7e096a","Type":"ContainerStarted","Data":"6b9cc156b6a77e8c8e3f7c557a6352e3ae3051572a21e9eec09e97cc507e2c42"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.921852 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementdcc5-account-delete-bdqp9" event={"ID":"24b89e30-7a68-4c02-8386-cc104108a8ea","Type":"ContainerStarted","Data":"28328e634aa30218f149f9ef5d6f07fdb26459d3788ada2018a1f826ada9399e"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.928900 4789 generic.go:334] "Generic (PLEG): container finished" podID="00293d36-0c18-4d79-aacd-4224045ff895" containerID="702d34f2d07c6bd63fc7dbcd22cbe07825238be81729f58170d0b20b26fc3bf9" exitCode=0 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.928966 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00293d36-0c18-4d79-aacd-4224045ff895","Type":"ContainerDied","Data":"702d34f2d07c6bd63fc7dbcd22cbe07825238be81729f58170d0b20b26fc3bf9"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.932863 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder9a65-account-delete-cbfmk" event={"ID":"bf1af2cc-24b9-4786-befa-74623fca05f7","Type":"ContainerStarted","Data":"a1872839aff7c6aaf90d00b55620f39375ddd1da68d35e0d41748179ba2ee470"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.945161 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican30bf-account-delete-mwl8m" event={"ID":"1964cf41-49d7-4b0d-ab8b-fbf9b621e359","Type":"ContainerStarted","Data":"1f176ee0624459f5cffe99a925013e3dc2f1324da883fbea30790016109ae3fa"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.973109 4789 generic.go:334] "Generic (PLEG): container finished" podID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" 
containerID="f71a167007b51e6b5519402191320729c426676f344ae45a6739ee1603881192" exitCode=143 Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.973187 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbbc99994-cwbpw" event={"ID":"93b0a572-437e-4d15-a74d-e92c0f39c9cc","Type":"ContainerDied","Data":"f71a167007b51e6b5519402191320729c426676f344ae45a6739ee1603881192"} Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.975173 4789 scope.go:117] "RemoveContainer" containerID="f8d1a6dd5068981a14ce87c5e3a5413bba046f93f1f41541bf47903d89cd9291" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.975273 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 07:16:15 crc kubenswrapper[4789]: I1216 07:16:15.991509 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" event={"ID":"24f668c2-651f-48f2-8feb-7faa470c3a19","Type":"ContainerStarted","Data":"50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.017034 4789 generic.go:334] "Generic (PLEG): container finished" podID="b5429404-d973-4580-961a-8ad6081e93ec" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" exitCode=0 Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.017126 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tblns" event={"ID":"b5429404-d973-4580-961a-8ad6081e93ec","Type":"ContainerDied","Data":"b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.026394 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" event={"ID":"0ff4de9f-c7d4-4d77-81be-7a499ead0f10","Type":"ContainerDied","Data":"9611a31b61e3c622dafebea51d27dbf4d41aaed7f4df133158e79d4bda05d9fd"} Dec 16 07:16:16 crc 
kubenswrapper[4789]: I1216 07:16:16.026503 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcd6f8f8f-mgsqq" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.032168 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.032290 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc7s9\" (UniqueName: \"kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9\") pod \"barbican-api-7b855dbb8b-d8wxq\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.035150 4789 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.035789 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. No retries permitted until 2025-12-16 07:16:20.03577344 +0000 UTC m=+1518.297661069 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : secret "barbican-config-data" not found Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.038291 4789 projected.go:194] Error preparing data for projected volume kube-api-access-nc7s9 for pod openstack/barbican-api-7b855dbb8b-d8wxq: failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.038340 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9 podName:fe8a84b3-ac5c-4ac8-a302-591548a970dd nodeName:}" failed. No retries permitted until 2025-12-16 07:16:20.038324672 +0000 UTC m=+1518.300212291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nc7s9" (UniqueName: "kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9") pod "barbican-api-7b855dbb8b-d8wxq" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd") : failed to fetch token: serviceaccounts "barbican-barbican" not found Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.041959 4789 generic.go:334] "Generic (PLEG): container finished" podID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerID="38cb122304e1fdaa474b979a230d2c0c3d7a6825b41760a62e2b6923a3837070" exitCode=143 Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.042041 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" event={"ID":"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0","Type":"ContainerDied","Data":"38cb122304e1fdaa474b979a230d2c0c3d7a6825b41760a62e2b6923a3837070"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.057056 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifc07-account-delete-bxbmv" 
event={"ID":"0491a70b-b044-4ec4-b179-778967cd4573","Type":"ContainerStarted","Data":"0930e632f5911a19af40b37bfd265349ca7f713b1462c8df6cc8e79a6d764317"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.095299 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04bcc-account-delete-wzgv2" event={"ID":"fdea835b-f122-4db5-b7c1-ca180d9f3853","Type":"ContainerStarted","Data":"7a9499badf06cab15799c2847dee27997b0abd458005dd60c0bd4bf7c8e78b2b"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.097964 4789 generic.go:334] "Generic (PLEG): container finished" podID="f00adc24-beed-43df-95a8-274b841d60a0" containerID="305b96e7f12f126be8501fb24906a6e570466b444e3bf1c2ee9a66e30d5add7f" exitCode=143 Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.098032 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-864d99d789-mv5rh" event={"ID":"f00adc24-beed-43df-95a8-274b841d60a0","Type":"ContainerDied","Data":"305b96e7f12f126be8501fb24906a6e570466b444e3bf1c2ee9a66e30d5add7f"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.099613 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccb68857-5qpdn" event={"ID":"c5bd2649-9508-49bb-833e-7239b7d11d78","Type":"ContainerStarted","Data":"02f3a045768469bcb82812149404958f11cc790ef72b1c282478eb934a9bafb0"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.101431 4789 generic.go:334] "Generic (PLEG): container finished" podID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerID="eddf93e3cc7bc715cd0565a58d08d71f72cabb8839a08286cf1924335b971dde" exitCode=0 Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.101486 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f699c71b-1e44-4a4d-b1fb-77ef105af03d","Type":"ContainerDied","Data":"eddf93e3cc7bc715cd0565a58d08d71f72cabb8839a08286cf1924335b971dde"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.103582 4789 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.104119 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5298-account-delete-lmmgd" event={"ID":"82acf941-5ce6-4e18-bc6d-1809296622eb","Type":"ContainerStarted","Data":"cc8cad5792de3024a72ca536635b38e2a98f99e729004df0773d87824e5dd86e"} Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.138322 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" path="/var/lib/kubelet/pods/2fceb99a-9dfd-4d79-a0fd-666390de4440/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.139096 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfbba1f-9f1d-4994-b831-e6fd0d7d9826" path="/var/lib/kubelet/pods/5cfbba1f-9f1d-4994-b831-e6fd0d7d9826/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.139879 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" path="/var/lib/kubelet/pods/6456012f-c7be-458c-a9a5-b3958ae72c2c/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.141313 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978" path="/var/lib/kubelet/pods/a0c7fab5-e2bd-4a68-9a72-3eeecb4ce978/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.142318 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6e332d-a38d-4ec6-b875-aad75c5491f4" path="/var/lib/kubelet/pods/ce6e332d-a38d-4ec6-b875-aad75c5491f4/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.143101 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23503f0-7f00-4d2d-830b-fed7db6e6a08" path="/var/lib/kubelet/pods/e23503f0-7f00-4d2d-830b-fed7db6e6a08/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.144561 
4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48c05ec-30ab-4ea1-a542-35bf74481375" path="/var/lib/kubelet/pods/e48c05ec-30ab-4ea1-a542-35bf74481375/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.145185 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e72bc32c-5282-4477-9bc0-450e94561956" path="/var/lib/kubelet/pods/e72bc32c-5282-4477-9bc0-450e94561956/volumes" Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.340762 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.340848 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data podName:9452e1b2-42ec-47b6-96e1-2770c9e76db2 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:20.340830301 +0000 UTC m=+1518.602717930 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data") pod "rabbitmq-cell1-server-0" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.414963 4789 scope.go:117] "RemoveContainer" containerID="1592998b94c2a7232306fe54bf6cc98c4eff08132169b26bf5e601d8973f4fa0" Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.442683 4789 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 16 07:16:16 crc kubenswrapper[4789]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-16T07:16:14Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 16 07:16:16 crc kubenswrapper[4789]: /etc/init.d/functions: line 589: 442 Alarm clock "$@" Dec 16 07:16:16 crc kubenswrapper[4789]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-cw7z9" message=< Dec 16 07:16:16 crc kubenswrapper[4789]: Exiting ovn-controller (1) [FAILED] Dec 16 07:16:16 crc kubenswrapper[4789]: Killing ovn-controller (1) [ OK ] Dec 16 07:16:16 crc kubenswrapper[4789]: 2025-12-16T07:16:14Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 16 07:16:16 crc kubenswrapper[4789]: /etc/init.d/functions: line 589: 442 Alarm clock "$@" Dec 16 07:16:16 crc kubenswrapper[4789]: > Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.442722 4789 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 16 07:16:16 crc kubenswrapper[4789]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-16T07:16:14Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 16 07:16:16 crc kubenswrapper[4789]: /etc/init.d/functions: line 589: 442 Alarm clock "$@" Dec 16 07:16:16 crc kubenswrapper[4789]: > pod="openstack/ovn-controller-cw7z9" 
podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" containerID="cri-o://523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.442811 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-cw7z9" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" containerID="cri-o://523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83" gracePeriod=27 Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.530849 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.571238 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placementdcc5-account-delete-bdqp9" podStartSLOduration=5.571211405 podStartE2EDuration="5.571211405s" podCreationTimestamp="2025-12-16 07:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:16:15.942757118 +0000 UTC m=+1514.204644767" watchObservedRunningTime="2025-12-16 07:16:16.571211405 +0000 UTC m=+1514.833099034" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.591098 4789 scope.go:117] "RemoveContainer" containerID="c3e023a9afe9c5691200c93fa3115544e6576e412ec57a86b25b5ebaba43ebb1" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.607633 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.614181 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-mgsqq"] Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.615115 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.621664 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcd6f8f8f-mgsqq"] Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.645568 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-combined-ca-bundle\") pod \"00293d36-0c18-4d79-aacd-4224045ff895\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.646410 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msn76\" (UniqueName: \"kubernetes.io/projected/00293d36-0c18-4d79-aacd-4224045ff895-kube-api-access-msn76\") pod \"00293d36-0c18-4d79-aacd-4224045ff895\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.646603 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-nova-novncproxy-tls-certs\") pod \"00293d36-0c18-4d79-aacd-4224045ff895\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.646723 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-config-data\") pod \"00293d36-0c18-4d79-aacd-4224045ff895\" (UID: \"00293d36-0c18-4d79-aacd-4224045ff895\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.646867 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-vencrypt-tls-certs\") pod \"00293d36-0c18-4d79-aacd-4224045ff895\" (UID: 
\"00293d36-0c18-4d79-aacd-4224045ff895\") " Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.656383 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c is running failed: container process not found" containerID="7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.685573 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c is running failed: container process not found" containerID="7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.687385 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c is running failed: container process not found" containerID="7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:16 crc kubenswrapper[4789]: E1216 07:16:16.687542 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="358d8958-a563-407c-8b7f-75aee19a3699" containerName="nova-cell0-conductor-conductor" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.693164 4789 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00293d36-0c18-4d79-aacd-4224045ff895-kube-api-access-msn76" (OuterVolumeSpecName: "kube-api-access-msn76") pod "00293d36-0c18-4d79-aacd-4224045ff895" (UID: "00293d36-0c18-4d79-aacd-4224045ff895"). InnerVolumeSpecName "kube-api-access-msn76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.697743 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.697855 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749318 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8a84b3-ac5c-4ac8-a302-591548a970dd-logs\") pod \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749368 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-combined-ca-bundle\") pod \"de637363-990a-4590-b9c5-ab66c18ec270\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749408 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cct4q\" (UniqueName: \"kubernetes.io/projected/de637363-990a-4590-b9c5-ab66c18ec270-kube-api-access-cct4q\") pod \"de637363-990a-4590-b9c5-ab66c18ec270\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749457 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-public-tls-certs\") pod \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749523 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-internal-tls-certs\") pod \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749562 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data-custom\") pod \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749590 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data-custom\") pod \"de637363-990a-4590-b9c5-ab66c18ec270\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749614 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data\") pod \"de637363-990a-4590-b9c5-ab66c18ec270\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.749690 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-combined-ca-bundle\") pod \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\" (UID: \"fe8a84b3-ac5c-4ac8-a302-591548a970dd\") " Dec 16 07:16:16 crc kubenswrapper[4789]: 
I1216 07:16:16.749717 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-scripts\") pod \"de637363-990a-4590-b9c5-ab66c18ec270\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.750439 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de637363-990a-4590-b9c5-ab66c18ec270-etc-machine-id\") pod \"de637363-990a-4590-b9c5-ab66c18ec270\" (UID: \"de637363-990a-4590-b9c5-ab66c18ec270\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.752005 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msn76\" (UniqueName: \"kubernetes.io/projected/00293d36-0c18-4d79-aacd-4224045ff895-kube-api-access-msn76\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.752082 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de637363-990a-4590-b9c5-ab66c18ec270-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "de637363-990a-4590-b9c5-ab66c18ec270" (UID: "de637363-990a-4590-b9c5-ab66c18ec270"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.752431 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe8a84b3-ac5c-4ac8-a302-591548a970dd-logs" (OuterVolumeSpecName: "logs") pod "fe8a84b3-ac5c-4ac8-a302-591548a970dd" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.776020 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fe8a84b3-ac5c-4ac8-a302-591548a970dd" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.777416 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe8a84b3-ac5c-4ac8-a302-591548a970dd" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.781585 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.808650 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de637363-990a-4590-b9c5-ab66c18ec270-kube-api-access-cct4q" (OuterVolumeSpecName: "kube-api-access-cct4q") pod "de637363-990a-4590-b9c5-ab66c18ec270" (UID: "de637363-990a-4590-b9c5-ab66c18ec270"). InnerVolumeSpecName "kube-api-access-cct4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.809017 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe8a84b3-ac5c-4ac8-a302-591548a970dd" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.809125 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-scripts" (OuterVolumeSpecName: "scripts") pod "de637363-990a-4590-b9c5-ab66c18ec270" (UID: "de637363-990a-4590-b9c5-ab66c18ec270"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.809620 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fe8a84b3-ac5c-4ac8-a302-591548a970dd" (UID: "fe8a84b3-ac5c-4ac8-a302-591548a970dd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.810363 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "de637363-990a-4590-b9c5-ab66c18ec270" (UID: "de637363-990a-4590-b9c5-ab66c18ec270"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.852756 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-generated\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.853611 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855588 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsgtr\" (UniqueName: \"kubernetes.io/projected/358d8958-a563-407c-8b7f-75aee19a3699-kube-api-access-tsgtr\") pod \"358d8958-a563-407c-8b7f-75aee19a3699\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855631 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-internal-tls-certs\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855691 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-config-data\") pod \"358d8958-a563-407c-8b7f-75aee19a3699\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " Dec 16 07:16:16 crc kubenswrapper[4789]: 
I1216 07:16:16.855726 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ncrb\" (UniqueName: \"kubernetes.io/projected/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kube-api-access-6ncrb\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855811 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-galera-tls-certs\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855844 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-config-data\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855892 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kolla-config\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855962 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-combined-ca-bundle\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.855990 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-combined-ca-bundle\") 
pod \"358d8958-a563-407c-8b7f-75aee19a3699\" (UID: \"358d8958-a563-407c-8b7f-75aee19a3699\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856013 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856056 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-run-httpd\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856085 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-default\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856153 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7stw\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-kube-api-access-t7stw\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856208 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-combined-ca-bundle\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856238 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-log-httpd\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856269 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-public-tls-certs\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-operator-scripts\") pod \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\" (UID: \"f699c71b-1e44-4a4d-b1fb-77ef105af03d\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.856331 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift\") pod \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\" (UID: \"fc02bf7e-2d67-40a4-94b0-5807631a5b2e\") " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857150 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857176 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de637363-990a-4590-b9c5-ab66c18ec270-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857191 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fe8a84b3-ac5c-4ac8-a302-591548a970dd-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857204 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cct4q\" (UniqueName: \"kubernetes.io/projected/de637363-990a-4590-b9c5-ab66c18ec270-kube-api-access-cct4q\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857217 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857230 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857243 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857254 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857268 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.857280 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.863995 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.864013 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.865196 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.865755 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.866402 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.867549 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00293d36-0c18-4d79-aacd-4224045ff895" (UID: "00293d36-0c18-4d79-aacd-4224045ff895"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.867615 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358d8958-a563-407c-8b7f-75aee19a3699-kube-api-access-tsgtr" (OuterVolumeSpecName: "kube-api-access-tsgtr") pod "358d8958-a563-407c-8b7f-75aee19a3699" (UID: "358d8958-a563-407c-8b7f-75aee19a3699"). InnerVolumeSpecName "kube-api-access-tsgtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.874118 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.876151 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kube-api-access-6ncrb" (OuterVolumeSpecName: "kube-api-access-6ncrb") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "kube-api-access-6ncrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.881719 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-kube-api-access-t7stw" (OuterVolumeSpecName: "kube-api-access-t7stw") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "kube-api-access-t7stw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.906870 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.958868 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.958900 4789 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.958928 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsgtr\" (UniqueName: \"kubernetes.io/projected/358d8958-a563-407c-8b7f-75aee19a3699-kube-api-access-tsgtr\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.958942 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ncrb\" (UniqueName: \"kubernetes.io/projected/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kube-api-access-6ncrb\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.958954 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.958983 4789 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.959003 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 
07:16:16.959012 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.959021 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f699c71b-1e44-4a4d-b1fb-77ef105af03d-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.959030 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7stw\" (UniqueName: \"kubernetes.io/projected/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-kube-api-access-t7stw\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:16 crc kubenswrapper[4789]: I1216 07:16:16.959039 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.050485 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83 is running failed: container process not found" containerID="523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.054012 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83 is running failed: container process not found" containerID="523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Dec 16 
07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.055004 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83 is running failed: container process not found" containerID="523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.055055 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-cw7z9" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.120737 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"de637363-990a-4590-b9c5-ab66c18ec270","Type":"ContainerDied","Data":"226f8c67ed7aa831b0b83d7a7a57da486ad5e31f13bab83cb64f67e17d72ce11"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.120787 4789 scope.go:117] "RemoveContainer" containerID="1fd3bff06a6b8d682fd662de9fa43cca1d63dd71d4ca1dc0b4dd34b7b40fb7d8" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.120882 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.139654 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder9a65-account-delete-cbfmk" event={"ID":"bf1af2cc-24b9-4786-befa-74623fca05f7","Type":"ContainerStarted","Data":"d60353fa5ba849ea6e4e0c3d7ef76b89ce84019c3a133410894721c746ae9ea9"} Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.160056 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.163879 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.176165 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.176343 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.176364 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.189861 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04bcc-account-delete-wzgv2" event={"ID":"fdea835b-f122-4db5-b7c1-ca180d9f3853","Type":"ContainerStarted","Data":"219fbc2ee038a6ec79a5c2ed6cf3962635b1244065178fb8a4e9bde8190c278e"} Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.194243 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.196030 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder9a65-account-delete-cbfmk" podStartSLOduration=6.196009773 podStartE2EDuration="6.196009773s" podCreationTimestamp="2025-12-16 07:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:16:17.18645586 +0000 UTC m=+1515.448343489" watchObservedRunningTime="2025-12-16 07:16:17.196009773 +0000 UTC m=+1515.457897402" 
Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.203971 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:17 crc kubenswrapper[4789]: E1216 07:16:17.204035 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.225325 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00293d36-0c18-4d79-aacd-4224045ff895","Type":"ContainerDied","Data":"a46dbca22a6efeef6310c97c3fda17650a2c7dc6c2a22977565a3c3b44654984"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.225467 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.273687 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.274063 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f699c71b-1e44-4a4d-b1fb-77ef105af03d","Type":"ContainerDied","Data":"e3b3746f35fb16f83fe0120e8d75eb4fa7a73e27440d42530fadd50fe2af8eae"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.299569 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.164:8776/healthcheck\": read tcp 10.217.0.2:35494->10.217.0.164:8776: read: connection reset by peer" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.300026 4789 generic.go:334] "Generic (PLEG): container finished" podID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerID="db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac" exitCode=0 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.300043 4789 generic.go:334] "Generic (PLEG): container finished" podID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerID="f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0" exitCode=0 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.300104 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7b9cd85-4tf54" event={"ID":"fc02bf7e-2d67-40a4-94b0-5807631a5b2e","Type":"ContainerDied","Data":"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.300130 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7b9cd85-4tf54" event={"ID":"fc02bf7e-2d67-40a4-94b0-5807631a5b2e","Type":"ContainerDied","Data":"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.300143 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-proxy-7f7b9cd85-4tf54" event={"ID":"fc02bf7e-2d67-40a4-94b0-5807631a5b2e","Type":"ContainerDied","Data":"71486eafa8ee116f27520b440d4c2fe007efa0cf3a0d2c0fb82ad3f2d77810e5"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.300212 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f7b9cd85-4tf54" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.317463 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:35634->10.217.0.203:8775: read: connection reset by peer" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.317586 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:35624->10.217.0.203:8775: read: connection reset by peer" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.325542 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" event={"ID":"24f668c2-651f-48f2-8feb-7faa470c3a19","Type":"ContainerStarted","Data":"b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.325701 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener-log" containerID="cri-o://50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.326169 4789 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener" containerID="cri-o://b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.337557 4789 generic.go:334] "Generic (PLEG): container finished" podID="358d8958-a563-407c-8b7f-75aee19a3699" containerID="7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" exitCode=0 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.337664 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"358d8958-a563-407c-8b7f-75aee19a3699","Type":"ContainerDied","Data":"7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.337697 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"358d8958-a563-407c-8b7f-75aee19a3699","Type":"ContainerDied","Data":"3d928f0f544dc5675e5d11d3ec179405a3a23b2e72e60efe823b2788d8fbe6ad"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.337774 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.360439 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.365823 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-cw7z9_1e16a3ef-920e-493a-ae2f-7336d64bbd7e/ovn-controller/0.log" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.365873 4789 generic.go:334] "Generic (PLEG): container finished" podID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerID="523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83" exitCode=143 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.365956 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9" event={"ID":"1e16a3ef-920e-493a-ae2f-7336d64bbd7e","Type":"ContainerDied","Data":"523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.365988 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cw7z9" event={"ID":"1e16a3ef-920e-493a-ae2f-7336d64bbd7e","Type":"ContainerDied","Data":"5dbcf4faa066d42f7701a1f9fed6538be3aac4a93eaef6d495518a71ebff75bb"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.366003 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dbcf4faa066d42f7701a1f9fed6538be3aac4a93eaef6d495518a71ebff75bb" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.368020 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccb68857-5qpdn" event={"ID":"c5bd2649-9508-49bb-833e-7239b7d11d78","Type":"ContainerStarted","Data":"cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.368051 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccb68857-5qpdn" event={"ID":"c5bd2649-9508-49bb-833e-7239b7d11d78","Type":"ContainerStarted","Data":"f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.368180 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6ccb68857-5qpdn" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker-log" containerID="cri-o://f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.368643 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6ccb68857-5qpdn" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker" containerID="cri-o://cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.371236 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.390279 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell04bcc-account-delete-wzgv2" podStartSLOduration=5.390256788 podStartE2EDuration="5.390256788s" podCreationTimestamp="2025-12-16 07:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:16:17.230191304 +0000 UTC m=+1515.492078933" watchObservedRunningTime="2025-12-16 07:16:17.390256788 +0000 UTC m=+1515.652144407" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.391438 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" 
podStartSLOduration=6.391429526 podStartE2EDuration="6.391429526s" podCreationTimestamp="2025-12-16 07:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:16:17.351037634 +0000 UTC m=+1515.612925263" watchObservedRunningTime="2025-12-16 07:16:17.391429526 +0000 UTC m=+1515.653317155" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.399705 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.423580 4789 generic.go:334] "Generic (PLEG): container finished" podID="24b89e30-7a68-4c02-8386-cc104108a8ea" containerID="8bbc52e42bff37f691a5171e008b43120b3eb76895f170e5be9ef714237b885f" exitCode=0 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.423653 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b855dbb8b-d8wxq" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.427125 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementdcc5-account-delete-bdqp9" event={"ID":"24b89e30-7a68-4c02-8386-cc104108a8ea","Type":"ContainerDied","Data":"8bbc52e42bff37f691a5171e008b43120b3eb76895f170e5be9ef714237b885f"} Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.422651 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6ccb68857-5qpdn" podStartSLOduration=6.422633306 podStartE2EDuration="6.422633306s" podCreationTimestamp="2025-12-16 07:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:16:17.4220246 +0000 UTC m=+1515.683912229" watchObservedRunningTime="2025-12-16 07:16:17.422633306 +0000 UTC m=+1515.684520935" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.469836 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "358d8958-a563-407c-8b7f-75aee19a3699" (UID: "358d8958-a563-407c-8b7f-75aee19a3699"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.477383 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-config-data" (OuterVolumeSpecName: "config-data") pod "00293d36-0c18-4d79-aacd-4224045ff895" (UID: "00293d36-0c18-4d79-aacd-4224045ff895"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.476152 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.477801 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.513043 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "00293d36-0c18-4d79-aacd-4224045ff895" (UID: "00293d36-0c18-4d79-aacd-4224045ff895"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.551007 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "00293d36-0c18-4d79-aacd-4224045ff895" (UID: "00293d36-0c18-4d79-aacd-4224045ff895"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.551228 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-config-data" (OuterVolumeSpecName: "config-data") pod "358d8958-a563-407c-8b7f-75aee19a3699" (UID: "358d8958-a563-407c-8b7f-75aee19a3699"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.584738 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.584768 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358d8958-a563-407c-8b7f-75aee19a3699-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.584777 4789 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.584788 4789 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00293d36-0c18-4d79-aacd-4224045ff895-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.643870 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.651286 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.666055 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de637363-990a-4590-b9c5-ab66c18ec270" (UID: "de637363-990a-4590-b9c5-ab66c18ec270"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.666292 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f699c71b-1e44-4a4d-b1fb-77ef105af03d" (UID: "f699c71b-1e44-4a4d-b1fb-77ef105af03d"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.680348 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.687584 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.687634 4789 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f699c71b-1e44-4a4d-b1fb-77ef105af03d-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.687644 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.687652 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.687663 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.746214 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.746631 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-central-agent" containerID="cri-o://9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.747193 4789 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="sg-core" containerID="cri-o://c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.747223 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="proxy-httpd" containerID="cri-o://231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.747269 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-notification-agent" containerID="cri-o://02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.755126 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bbbc99994-cwbpw" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:58882->10.217.0.155:9311: read: connection reset by peer" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.755277 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bbbc99994-cwbpw" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:58880->10.217.0.155:9311: read: connection reset by peer" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.759138 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-config-data" 
(OuterVolumeSpecName: "config-data") pod "fc02bf7e-2d67-40a4-94b0-5807631a5b2e" (UID: "fc02bf7e-2d67-40a4-94b0-5807631a5b2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.790490 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc02bf7e-2d67-40a4-94b0-5807631a5b2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.888612 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.895453 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="29706741-1258-454c-968f-836e472cb685" containerName="kube-state-metrics" containerID="cri-o://a0a363829296ba32ca72eb47110c7682a3a5f2c237669efd7a55541ecf92e1ab" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.896118 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data" (OuterVolumeSpecName: "config-data") pod "de637363-990a-4590-b9c5-ab66c18ec270" (UID: "de637363-990a-4590-b9c5-ab66c18ec270"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.906087 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.906310 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="db3b1c91-5558-4afb-a9fc-dd75527451ee" containerName="memcached" containerID="cri-o://530a5d65e6717bf5cb2f7088a0efb5a0b81a55b156e61bc16cea37ce6d84bbe3" gracePeriod=30 Dec 16 07:16:17 crc kubenswrapper[4789]: I1216 07:16:17.996059 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de637363-990a-4590-b9c5-ab66c18ec270-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.008107 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vm9j2"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.029049 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vm9j2"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.036418 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jlc8k"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.044972 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jlc8k"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.052304 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.057754 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6789db9888-57dmq"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.057979 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6789db9888-57dmq" podUID="254f667d-eae3-486b-b9e8-ffc571d65635" 
containerName="keystone-api" containerID="cri-o://d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269" gracePeriod=30 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.091086 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-sm8ns"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.104710 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-sm8ns"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.128688 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d5548c-14fe-416e-86d8-f6845cbcc57c" path="/var/lib/kubelet/pods/08d5548c-14fe-416e-86d8-f6845cbcc57c/volumes" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.129345 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" path="/var/lib/kubelet/pods/0ff4de9f-c7d4-4d77-81be-7a499ead0f10/volumes" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.130061 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0" path="/var/lib/kubelet/pods/2fafc8a0-5a6a-4889-8abe-dd6e1ec1d0c0/volumes" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.131321 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deca8e15-b233-4a6c-bc1e-06494fca64bb" path="/var/lib/kubelet/pods/deca8e15-b233-4a6c-bc1e-06494fca64bb/volumes" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.132944 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c869-account-create-update-dwt2f"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.132971 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c869-account-create-update-dwt2f"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.380316 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" 
podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerName="galera" containerID="cri-o://08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d" gracePeriod=30 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.444711 4789 generic.go:334] "Generic (PLEG): container finished" podID="f00adc24-beed-43df-95a8-274b841d60a0" containerID="0f48a94f282f9b6fdc9bae2f55acd33cf0b6397237ba430e41c42e3e2660b0b4" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.444781 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-864d99d789-mv5rh" event={"ID":"f00adc24-beed-43df-95a8-274b841d60a0","Type":"ContainerDied","Data":"0f48a94f282f9b6fdc9bae2f55acd33cf0b6397237ba430e41c42e3e2660b0b4"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.446472 4789 generic.go:334] "Generic (PLEG): container finished" podID="7f2338e7-2de7-4149-bb6a-ae978c7e096a" containerID="94f733234febab44be66c3a6362c59850bc8bebe5864b744fd69d4b571a3292f" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.446531 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9ba2-account-delete-tgp7b" event={"ID":"7f2338e7-2de7-4149-bb6a-ae978c7e096a","Type":"ContainerDied","Data":"94f733234febab44be66c3a6362c59850bc8bebe5864b744fd69d4b571a3292f"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.449352 4789 generic.go:334] "Generic (PLEG): container finished" podID="29706741-1258-454c-968f-836e472cb685" containerID="a0a363829296ba32ca72eb47110c7682a3a5f2c237669efd7a55541ecf92e1ab" exitCode=2 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.449409 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29706741-1258-454c-968f-836e472cb685","Type":"ContainerDied","Data":"a0a363829296ba32ca72eb47110c7682a3a5f2c237669efd7a55541ecf92e1ab"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.469206 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerID="17038ebe8bb333b1b73372f442dc14a740d2ad41822921ccc7adc857ef4a9c8b" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.469281 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84","Type":"ContainerDied","Data":"17038ebe8bb333b1b73372f442dc14a740d2ad41822921ccc7adc857ef4a9c8b"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.473879 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican30bf-account-delete-mwl8m" event={"ID":"1964cf41-49d7-4b0d-ab8b-fbf9b621e359","Type":"ContainerStarted","Data":"c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.474475 4789 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbican30bf-account-delete-mwl8m" secret="" err="secret \"galera-openstack-dockercfg-5nxcs\" not found" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.491935 4789 generic.go:334] "Generic (PLEG): container finished" podID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerID="063de08125ee34cdd307d91af26a93c26622ecd506fc1ef247a55845563c91d4" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.492045 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbbc99994-cwbpw" event={"ID":"93b0a572-437e-4d15-a74d-e92c0f39c9cc","Type":"ContainerDied","Data":"063de08125ee34cdd307d91af26a93c26622ecd506fc1ef247a55845563c91d4"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.494334 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifc07-account-delete-bxbmv" event={"ID":"0491a70b-b044-4ec4-b179-778967cd4573","Type":"ContainerStarted","Data":"e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.496034 4789 
kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapifc07-account-delete-bxbmv" secret="" err="secret \"galera-openstack-dockercfg-5nxcs\" not found" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.498129 4789 generic.go:334] "Generic (PLEG): container finished" podID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerID="50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a" exitCode=143 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.498226 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" event={"ID":"24f668c2-651f-48f2-8feb-7faa470c3a19","Type":"ContainerDied","Data":"50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.501905 4789 generic.go:334] "Generic (PLEG): container finished" podID="8368d044-b088-48f9-b5cb-19a95b997576" containerID="f58f590eff39129dc3fd6cbf997894d78a3061978019b25eafe3b8d013aa5949" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.501980 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b48fd45b4-hp2xw" event={"ID":"8368d044-b088-48f9-b5cb-19a95b997576","Type":"ContainerDied","Data":"f58f590eff39129dc3fd6cbf997894d78a3061978019b25eafe3b8d013aa5949"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.505660 4789 generic.go:334] "Generic (PLEG): container finished" podID="bf1af2cc-24b9-4786-befa-74623fca05f7" containerID="d60353fa5ba849ea6e4e0c3d7ef76b89ce84019c3a133410894721c746ae9ea9" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.505718 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder9a65-account-delete-cbfmk" event={"ID":"bf1af2cc-24b9-4786-befa-74623fca05f7","Type":"ContainerDied","Data":"d60353fa5ba849ea6e4e0c3d7ef76b89ce84019c3a133410894721c746ae9ea9"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.508882 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican30bf-account-delete-mwl8m" podStartSLOduration=6.508864939 podStartE2EDuration="6.508864939s" podCreationTimestamp="2025-12-16 07:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:16:18.488277928 +0000 UTC m=+1516.750165557" watchObservedRunningTime="2025-12-16 07:16:18.508864939 +0000 UTC m=+1516.770752568" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.520360 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapifc07-account-delete-bxbmv" podStartSLOduration=6.520340488 podStartE2EDuration="6.520340488s" podCreationTimestamp="2025-12-16 07:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:16:18.51098588 +0000 UTC m=+1516.772873499" watchObservedRunningTime="2025-12-16 07:16:18.520340488 +0000 UTC m=+1516.782228117" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.520435 4789 generic.go:334] "Generic (PLEG): container finished" podID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerID="2339da9c2a0e96f20cb674ffa23240a66f38f9520dccf9e45eba3c48effd6316" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.520460 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" event={"ID":"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0","Type":"ContainerDied","Data":"2339da9c2a0e96f20cb674ffa23240a66f38f9520dccf9e45eba3c48effd6316"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.541278 4789 generic.go:334] "Generic (PLEG): container finished" podID="fdea835b-f122-4db5-b7c1-ca180d9f3853" containerID="219fbc2ee038a6ec79a5c2ed6cf3962635b1244065178fb8a4e9bde8190c278e" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.541363 4789 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04bcc-account-delete-wzgv2" event={"ID":"fdea835b-f122-4db5-b7c1-ca180d9f3853","Type":"ContainerDied","Data":"219fbc2ee038a6ec79a5c2ed6cf3962635b1244065178fb8a4e9bde8190c278e"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.551874 4789 generic.go:334] "Generic (PLEG): container finished" podID="82acf941-5ce6-4e18-bc6d-1809296622eb" containerID="e1eca940464b8ae3b391be41652790cb6e58d7ba7d126643ddbe8f964dc950c4" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.552010 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5298-account-delete-lmmgd" event={"ID":"82acf941-5ce6-4e18-bc6d-1809296622eb","Type":"ContainerDied","Data":"e1eca940464b8ae3b391be41652790cb6e58d7ba7d126643ddbe8f964dc950c4"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.570498 4789 generic.go:334] "Generic (PLEG): container finished" podID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerID="dfe47974cb64535408aeb67063f1a4814aa8aaad5cefa3463823fe0dd085e7b6" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.570557 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5","Type":"ContainerDied","Data":"dfe47974cb64535408aeb67063f1a4814aa8aaad5cefa3463823fe0dd085e7b6"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.593448 4789 generic.go:334] "Generic (PLEG): container finished" podID="1894718e-3dac-4430-9285-e397fb21e852" containerID="231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.593489 4789 generic.go:334] "Generic (PLEG): container finished" podID="1894718e-3dac-4430-9285-e397fb21e852" containerID="c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6" exitCode=2 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.593544 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerDied","Data":"231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.593573 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerDied","Data":"c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.598147 4789 generic.go:334] "Generic (PLEG): container finished" podID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerID="79b3fcff6d02b1d1105cdaa7d49563ca416afc3d0d209ae94eae9fd336eca759" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.598196 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6423ab7-79a3-402c-9115-e54b5f29ad05","Type":"ContainerDied","Data":"79b3fcff6d02b1d1105cdaa7d49563ca416afc3d0d209ae94eae9fd336eca759"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.599834 4789 generic.go:334] "Generic (PLEG): container finished" podID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerID="f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1" exitCode=143 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.600079 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccb68857-5qpdn" event={"ID":"c5bd2649-9508-49bb-833e-7239b7d11d78","Type":"ContainerDied","Data":"f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.601342 4789 generic.go:334] "Generic (PLEG): container finished" podID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerID="eea19a9862fbb3d803e89b2d7c5b8ef5c9fd9bd0d293359d33b0d30a3cecccac" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.601382 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"5031d0ac-42ac-4346-9403-0369a555ab4a","Type":"ContainerDied","Data":"eea19a9862fbb3d803e89b2d7c5b8ef5c9fd9bd0d293359d33b0d30a3cecccac"} Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.603259 4789 generic.go:334] "Generic (PLEG): container finished" podID="37216df1-3f61-412b-bffb-5e36812383f4" containerID="82b67d2f0b7d827d390a1737f28832c66819bfb58a92aae85e467319754e80a4" exitCode=0 Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.603490 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37216df1-3f61-412b-bffb-5e36812383f4","Type":"ContainerDied","Data":"82b67d2f0b7d827d390a1737f28832c66819bfb58a92aae85e467319754e80a4"} Dec 16 07:16:18 crc kubenswrapper[4789]: E1216 07:16:18.621789 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:18 crc kubenswrapper[4789]: E1216 07:16:18.621849 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts podName:0491a70b-b044-4ec4-b179-778967cd4573 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:19.121834156 +0000 UTC m=+1517.383721785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts") pod "novaapifc07-account-delete-bxbmv" (UID: "0491a70b-b044-4ec4-b179-778967cd4573") : configmap "openstack-scripts" not found Dec 16 07:16:18 crc kubenswrapper[4789]: E1216 07:16:18.622035 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:18 crc kubenswrapper[4789]: E1216 07:16:18.622084 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts podName:1964cf41-49d7-4b0d-ab8b-fbf9b621e359 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:19.122068882 +0000 UTC m=+1517.383956511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts") pod "barbican30bf-account-delete-mwl8m" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359") : configmap "openstack-scripts" not found Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.686779 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-cw7z9_1e16a3ef-920e-493a-ae2f-7336d64bbd7e/ovn-controller/0.log" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.687130 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.711514 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.722589 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-log-ovn\") pod \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.722686 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-ovn-controller-tls-certs\") pod \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.722741 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run\") pod \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.722776 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-combined-ca-bundle\") pod \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.722829 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run-ovn\") pod \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.722868 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-scripts\") pod \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.722937 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5q99\" (UniqueName: \"kubernetes.io/projected/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-kube-api-access-v5q99\") pod \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\" (UID: \"1e16a3ef-920e-493a-ae2f-7336d64bbd7e\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.724879 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run" (OuterVolumeSpecName: "var-run") pod "1e16a3ef-920e-493a-ae2f-7336d64bbd7e" (UID: "1e16a3ef-920e-493a-ae2f-7336d64bbd7e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.724952 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1e16a3ef-920e-493a-ae2f-7336d64bbd7e" (UID: "1e16a3ef-920e-493a-ae2f-7336d64bbd7e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.724981 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1e16a3ef-920e-493a-ae2f-7336d64bbd7e" (UID: "1e16a3ef-920e-493a-ae2f-7336d64bbd7e"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.726170 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-scripts" (OuterVolumeSpecName: "scripts") pod "1e16a3ef-920e-493a-ae2f-7336d64bbd7e" (UID: "1e16a3ef-920e-493a-ae2f-7336d64bbd7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.749152 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-kube-api-access-v5q99" (OuterVolumeSpecName: "kube-api-access-v5q99") pod "1e16a3ef-920e-493a-ae2f-7336d64bbd7e" (UID: "1e16a3ef-920e-493a-ae2f-7336d64bbd7e"). InnerVolumeSpecName "kube-api-access-v5q99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.759022 4789 scope.go:117] "RemoveContainer" containerID="7a559de8c4fc233747aed0e14dd0fbf6aa46b087f910b2378d491cf160c0c80e" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.770460 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e16a3ef-920e-493a-ae2f-7336d64bbd7e" (UID: "1e16a3ef-920e-493a-ae2f-7336d64bbd7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.814553 4789 scope.go:117] "RemoveContainer" containerID="702d34f2d07c6bd63fc7dbcd22cbe07825238be81729f58170d0b20b26fc3bf9" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.816817 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.823306 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825030 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwllj\" (UniqueName: \"kubernetes.io/projected/29706741-1258-454c-968f-836e472cb685-kube-api-access-pwllj\") pod \"29706741-1258-454c-968f-836e472cb685\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825140 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-config\") pod \"29706741-1258-454c-968f-836e472cb685\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825232 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-combined-ca-bundle\") pod \"29706741-1258-454c-968f-836e472cb685\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825314 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-certs\") pod \"29706741-1258-454c-968f-836e472cb685\" (UID: \"29706741-1258-454c-968f-836e472cb685\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825678 4789 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run\") on node \"crc\" DevicePath 
\"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825693 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825703 4789 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825711 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825721 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5q99\" (UniqueName: \"kubernetes.io/projected/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-kube-api-access-v5q99\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.825729 4789 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.835699 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.846754 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29706741-1258-454c-968f-836e472cb685-kube-api-access-pwllj" (OuterVolumeSpecName: "kube-api-access-pwllj") pod "29706741-1258-454c-968f-836e472cb685" (UID: "29706741-1258-454c-968f-836e472cb685"). InnerVolumeSpecName "kube-api-access-pwllj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.848110 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.867476 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "29706741-1258-454c-968f-836e472cb685" (UID: "29706741-1258-454c-968f-836e472cb685"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.875228 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b855dbb8b-d8wxq"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.876546 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "1e16a3ef-920e-493a-ae2f-7336d64bbd7e" (UID: "1e16a3ef-920e-493a-ae2f-7336d64bbd7e"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.882356 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b855dbb8b-d8wxq"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.902246 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29706741-1258-454c-968f-836e472cb685" (UID: "29706741-1258-454c-968f-836e472cb685"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926497 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-internal-tls-certs\") pod \"8368d044-b088-48f9-b5cb-19a95b997576\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926547 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-config-data\") pod \"8368d044-b088-48f9-b5cb-19a95b997576\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926586 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-scripts\") pod \"8368d044-b088-48f9-b5cb-19a95b997576\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926639 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data\") pod \"f00adc24-beed-43df-95a8-274b841d60a0\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926681 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-combined-ca-bundle\") pod \"8368d044-b088-48f9-b5cb-19a95b997576\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926712 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data-custom\") pod \"f00adc24-beed-43df-95a8-274b841d60a0\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926794 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-872l5\" (UniqueName: \"kubernetes.io/projected/f00adc24-beed-43df-95a8-274b841d60a0-kube-api-access-872l5\") pod \"f00adc24-beed-43df-95a8-274b841d60a0\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926845 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8368d044-b088-48f9-b5cb-19a95b997576-logs\") pod \"8368d044-b088-48f9-b5cb-19a95b997576\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926873 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-public-tls-certs\") pod \"8368d044-b088-48f9-b5cb-19a95b997576\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.926936 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48lrv\" (UniqueName: \"kubernetes.io/projected/8368d044-b088-48f9-b5cb-19a95b997576-kube-api-access-48lrv\") pod \"8368d044-b088-48f9-b5cb-19a95b997576\" (UID: \"8368d044-b088-48f9-b5cb-19a95b997576\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927036 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f00adc24-beed-43df-95a8-274b841d60a0-logs\") pod \"f00adc24-beed-43df-95a8-274b841d60a0\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927067 
4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-combined-ca-bundle\") pod \"f00adc24-beed-43df-95a8-274b841d60a0\" (UID: \"f00adc24-beed-43df-95a8-274b841d60a0\") " Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927535 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwllj\" (UniqueName: \"kubernetes.io/projected/29706741-1258-454c-968f-836e472cb685-kube-api-access-pwllj\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927559 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e16a3ef-920e-493a-ae2f-7336d64bbd7e-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927572 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc7s9\" (UniqueName: \"kubernetes.io/projected/fe8a84b3-ac5c-4ac8-a302-591548a970dd-kube-api-access-nc7s9\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927585 4789 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927597 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8a84b3-ac5c-4ac8-a302-591548a970dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.927611 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:18 crc 
kubenswrapper[4789]: I1216 07:16:18.932733 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f00adc24-beed-43df-95a8-274b841d60a0" (UID: "f00adc24-beed-43df-95a8-274b841d60a0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.932977 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.937649 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-scripts" (OuterVolumeSpecName: "scripts") pod "8368d044-b088-48f9-b5cb-19a95b997576" (UID: "8368d044-b088-48f9-b5cb-19a95b997576"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.937795 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.938679 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00adc24-beed-43df-95a8-274b841d60a0-logs" (OuterVolumeSpecName: "logs") pod "f00adc24-beed-43df-95a8-274b841d60a0" (UID: "f00adc24-beed-43df-95a8-274b841d60a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.942127 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8368d044-b088-48f9-b5cb-19a95b997576-logs" (OuterVolumeSpecName: "logs") pod "8368d044-b088-48f9-b5cb-19a95b997576" (UID: "8368d044-b088-48f9-b5cb-19a95b997576"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.967060 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8368d044-b088-48f9-b5cb-19a95b997576-kube-api-access-48lrv" (OuterVolumeSpecName: "kube-api-access-48lrv") pod "8368d044-b088-48f9-b5cb-19a95b997576" (UID: "8368d044-b088-48f9-b5cb-19a95b997576"). InnerVolumeSpecName "kube-api-access-48lrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.969140 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f7b9cd85-4tf54"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.986559 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7f7b9cd85-4tf54"] Dec 16 07:16:18 crc kubenswrapper[4789]: I1216 07:16:18.997161 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f00adc24-beed-43df-95a8-274b841d60a0-kube-api-access-872l5" (OuterVolumeSpecName: "kube-api-access-872l5") pod "f00adc24-beed-43df-95a8-274b841d60a0" (UID: "f00adc24-beed-43df-95a8-274b841d60a0"). InnerVolumeSpecName "kube-api-access-872l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.000074 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "29706741-1258-454c-968f-836e472cb685" (UID: "29706741-1258-454c-968f-836e472cb685"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.012131 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.022041 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.028757 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f00adc24-beed-43df-95a8-274b841d60a0-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.028782 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.028790 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.028799 4789 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29706741-1258-454c-968f-836e472cb685-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.028808 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-872l5\" (UniqueName: \"kubernetes.io/projected/f00adc24-beed-43df-95a8-274b841d60a0-kube-api-access-872l5\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.028817 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8368d044-b088-48f9-b5cb-19a95b997576-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc 
kubenswrapper[4789]: I1216 07:16:19.028825 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48lrv\" (UniqueName: \"kubernetes.io/projected/8368d044-b088-48f9-b5cb-19a95b997576-kube-api-access-48lrv\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.036001 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.062135 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.085858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f00adc24-beed-43df-95a8-274b841d60a0" (UID: "f00adc24-beed-43df-95a8-274b841d60a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.091373 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8368d044-b088-48f9-b5cb-19a95b997576" (UID: "8368d044-b088-48f9-b5cb-19a95b997576"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.124417 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-config-data" (OuterVolumeSpecName: "config-data") pod "8368d044-b088-48f9-b5cb-19a95b997576" (UID: "8368d044-b088-48f9-b5cb-19a95b997576"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.130473 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.130679 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.130739 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.130840 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.131044 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts podName:0491a70b-b044-4ec4-b179-778967cd4573 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:20.131027202 +0000 UTC m=+1518.392914841 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts") pod "novaapifc07-account-delete-bxbmv" (UID: "0491a70b-b044-4ec4-b179-778967cd4573") : configmap "openstack-scripts" not found Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.131451 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.131533 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts podName:1964cf41-49d7-4b0d-ab8b-fbf9b621e359 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:20.131516694 +0000 UTC m=+1518.393404323 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts") pod "barbican30bf-account-delete-mwl8m" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359") : configmap "openstack-scripts" not found Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.174539 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8368d044-b088-48f9-b5cb-19a95b997576" (UID: "8368d044-b088-48f9-b5cb-19a95b997576"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.179280 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data" (OuterVolumeSpecName: "config-data") pod "f00adc24-beed-43df-95a8-274b841d60a0" (UID: "f00adc24-beed-43df-95a8-274b841d60a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.188457 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8368d044-b088-48f9-b5cb-19a95b997576" (UID: "8368d044-b088-48f9-b5cb-19a95b997576"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.221290 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.232388 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.232427 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00adc24-beed-43df-95a8-274b841d60a0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.232441 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8368d044-b088-48f9-b5cb-19a95b997576-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.240816 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.250694 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.250771 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="ovn-northd" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.290192 4789 scope.go:117] "RemoveContainer" containerID="eddf93e3cc7bc715cd0565a58d08d71f72cabb8839a08286cf1924335b971dde" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.302226 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.305091 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.307506 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.332953 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b0a572-437e-4d15-a74d-e92c0f39c9cc-logs\") pod \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333014 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29plf\" (UniqueName: \"kubernetes.io/projected/93b0a572-437e-4d15-a74d-e92c0f39c9cc-kube-api-access-29plf\") pod \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333037 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xr6l\" (UniqueName: \"kubernetes.io/projected/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-kube-api-access-8xr6l\") pod \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333057 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-combined-ca-bundle\") pod \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333091 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b89e30-7a68-4c02-8386-cc104108a8ea-operator-scripts\") pod \"24b89e30-7a68-4c02-8386-cc104108a8ea\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333107 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-logs\") pod \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333171 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shpn4\" (UniqueName: \"kubernetes.io/projected/24b89e30-7a68-4c02-8386-cc104108a8ea-kube-api-access-shpn4\") pod \"24b89e30-7a68-4c02-8386-cc104108a8ea\" (UID: \"24b89e30-7a68-4c02-8386-cc104108a8ea\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333198 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-public-tls-certs\") pod \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333213 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data\") pod \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333238 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data\") pod \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333273 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data-custom\") pod \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " Dec 16 07:16:19 crc 
kubenswrapper[4789]: I1216 07:16:19.333334 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data-custom\") pod \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333365 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-combined-ca-bundle\") pod \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\" (UID: \"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.333386 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-internal-tls-certs\") pod \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\" (UID: \"93b0a572-437e-4d15-a74d-e92c0f39c9cc\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.339111 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b0a572-437e-4d15-a74d-e92c0f39c9cc-logs" (OuterVolumeSpecName: "logs") pod "93b0a572-437e-4d15-a74d-e92c0f39c9cc" (UID: "93b0a572-437e-4d15-a74d-e92c0f39c9cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.339196 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24b89e30-7a68-4c02-8386-cc104108a8ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24b89e30-7a68-4c02-8386-cc104108a8ea" (UID: "24b89e30-7a68-4c02-8386-cc104108a8ea"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.344324 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-logs" (OuterVolumeSpecName: "logs") pod "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" (UID: "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.351162 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" (UID: "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.354726 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b89e30-7a68-4c02-8386-cc104108a8ea-kube-api-access-shpn4" (OuterVolumeSpecName: "kube-api-access-shpn4") pod "24b89e30-7a68-4c02-8386-cc104108a8ea" (UID: "24b89e30-7a68-4c02-8386-cc104108a8ea"). InnerVolumeSpecName "kube-api-access-shpn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.368943 4789 scope.go:117] "RemoveContainer" containerID="71816cb1e1dc225cc4c62c55a8064e75aa79bd2df0d364078f17dc7c513e994f" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.370067 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b0a572-437e-4d15-a74d-e92c0f39c9cc-kube-api-access-29plf" (OuterVolumeSpecName: "kube-api-access-29plf") pod "93b0a572-437e-4d15-a74d-e92c0f39c9cc" (UID: "93b0a572-437e-4d15-a74d-e92c0f39c9cc"). InnerVolumeSpecName "kube-api-access-29plf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.383685 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-kube-api-access-8xr6l" (OuterVolumeSpecName: "kube-api-access-8xr6l") pod "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" (UID: "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0"). InnerVolumeSpecName "kube-api-access-8xr6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.388840 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "93b0a572-437e-4d15-a74d-e92c0f39c9cc" (UID: "93b0a572-437e-4d15-a74d-e92c0f39c9cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436359 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24b89e30-7a68-4c02-8386-cc104108a8ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436392 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436403 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shpn4\" (UniqueName: \"kubernetes.io/projected/24b89e30-7a68-4c02-8386-cc104108a8ea-kube-api-access-shpn4\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436412 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436421 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436428 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b0a572-437e-4d15-a74d-e92c0f39c9cc-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436436 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29plf\" (UniqueName: \"kubernetes.io/projected/93b0a572-437e-4d15-a74d-e92c0f39c9cc-kube-api-access-29plf\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.436444 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xr6l\" (UniqueName: \"kubernetes.io/projected/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-kube-api-access-8xr6l\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.446258 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93b0a572-437e-4d15-a74d-e92c0f39c9cc" (UID: "93b0a572-437e-4d15-a74d-e92c0f39c9cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.464078 4789 scope.go:117] "RemoveContainer" containerID="db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.489202 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" (UID: "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.502604 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data" (OuterVolumeSpecName: "config-data") pod "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" (UID: "b6eb716a-c1c8-47f7-bef8-68fbdf9162c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.512010 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data" (OuterVolumeSpecName: "config-data") pod "93b0a572-437e-4d15-a74d-e92c0f39c9cc" (UID: "93b0a572-437e-4d15-a74d-e92c0f39c9cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.520833 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "93b0a572-437e-4d15-a74d-e92c0f39c9cc" (UID: "93b0a572-437e-4d15-a74d-e92c0f39c9cc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.521634 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "93b0a572-437e-4d15-a74d-e92c0f39c9cc" (UID: "93b0a572-437e-4d15-a74d-e92c0f39c9cc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.532618 4789 scope.go:117] "RemoveContainer" containerID="f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.538906 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.538957 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.538970 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.538986 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.538999 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.539012 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b0a572-437e-4d15-a74d-e92c0f39c9cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.556766 4789 scope.go:117] "RemoveContainer" containerID="db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac" Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.559317 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac\": container with ID starting with db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac not found: ID does not exist" containerID="db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.559375 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac"} err="failed to get container status \"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac\": rpc error: code = NotFound desc = could not find container \"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac\": container with ID starting with db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac not found: ID does not exist" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.559404 4789 scope.go:117] "RemoveContainer" containerID="f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0" Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.559716 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0\": container with ID starting with 
f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0 not found: ID does not exist" containerID="f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.559738 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0"} err="failed to get container status \"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0\": rpc error: code = NotFound desc = could not find container \"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0\": container with ID starting with f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0 not found: ID does not exist" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.559752 4789 scope.go:117] "RemoveContainer" containerID="db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.560001 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac"} err="failed to get container status \"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac\": rpc error: code = NotFound desc = could not find container \"db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac\": container with ID starting with db0a56d5a717e24a1c272413ddfad0696d0a2d108547e9ad438c234b04dcdcac not found: ID does not exist" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.560015 4789 scope.go:117] "RemoveContainer" containerID="f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.561015 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0"} err="failed to get container status 
\"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0\": rpc error: code = NotFound desc = could not find container \"f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0\": container with ID starting with f68fa58fece36ef1aa4de09764029153e8cf584b71616ebc61256d0bd76ec5c0 not found: ID does not exist" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.561049 4789 scope.go:117] "RemoveContainer" containerID="7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.584217 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.621381 4789 generic.go:334] "Generic (PLEG): container finished" podID="1894718e-3dac-4430-9285-e397fb21e852" containerID="9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4" exitCode=0 Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.621468 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerDied","Data":"9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.624417 4789 generic.go:334] "Generic (PLEG): container finished" podID="db3b1c91-5558-4afb-a9fc-dd75527451ee" containerID="530a5d65e6717bf5cb2f7088a0efb5a0b81a55b156e61bc16cea37ce6d84bbe3" exitCode=0 Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.624478 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"db3b1c91-5558-4afb-a9fc-dd75527451ee","Type":"ContainerDied","Data":"530a5d65e6717bf5cb2f7088a0efb5a0b81a55b156e61bc16cea37ce6d84bbe3"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.625408 4789 scope.go:117] "RemoveContainer" containerID="7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" Dec 16 07:16:19 crc 
kubenswrapper[4789]: E1216 07:16:19.626253 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c\": container with ID starting with 7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c not found: ID does not exist" containerID="7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.626291 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c"} err="failed to get container status \"7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c\": rpc error: code = NotFound desc = could not find container \"7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c\": container with ID starting with 7fbc88c1dc7ef73de47ce7e62d2172c16a2b94bc8ef52b71d35d533f0fb9575c not found: ID does not exist" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.630761 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementdcc5-account-delete-bdqp9" event={"ID":"24b89e30-7a68-4c02-8386-cc104108a8ea","Type":"ContainerDied","Data":"28328e634aa30218f149f9ef5d6f07fdb26459d3788ada2018a1f826ada9399e"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.630792 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28328e634aa30218f149f9ef5d6f07fdb26459d3788ada2018a1f826ada9399e" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.630834 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementdcc5-account-delete-bdqp9" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.639700 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-public-tls-certs\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.639816 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-scripts\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.639863 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-etc-machine-id\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.639879 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-logs\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.639945 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-combined-ca-bundle\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.639962 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.639980 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" event={"ID":"b6eb716a-c1c8-47f7-bef8-68fbdf9162c0","Type":"ContainerDied","Data":"bc3870b72ed3ba51a30fc640835cb553744960bf695d8ff62857e88a889b781d"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.640023 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data-custom\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.640048 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-internal-tls-certs\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.640050 4789 scope.go:117] "RemoveContainer" containerID="2339da9c2a0e96f20cb674ffa23240a66f38f9520dccf9e45eba3c48effd6316" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.640109 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skc8q\" (UniqueName: \"kubernetes.io/projected/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-kube-api-access-skc8q\") pod \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\" (UID: \"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5\") " Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.640183 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm" Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.640541 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.640593 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data podName:31336d9f-38cf-4805-927b-3ae986f6c88e nodeName:}" failed. No retries permitted until 2025-12-16 07:16:27.640579257 +0000 UTC m=+1525.902466886 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data") pod "rabbitmq-server-0" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e") : configmap "rabbitmq-config-data" not found Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.641335 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-logs" (OuterVolumeSpecName: "logs") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.645463 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-scripts" (OuterVolumeSpecName: "scripts") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.645513 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.657471 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-kube-api-access-skc8q" (OuterVolumeSpecName: "kube-api-access-skc8q") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "kube-api-access-skc8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.658178 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.658365 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5","Type":"ContainerDied","Data":"c06be19a5d6804f3e384f9fe32b42d0aabb7abea807e440b6d2e5268f311be81"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.664033 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b48fd45b4-hp2xw" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.665145 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b48fd45b4-hp2xw" event={"ID":"8368d044-b088-48f9-b5cb-19a95b997576","Type":"ContainerDied","Data":"368cc17b705bd079c2e4ff7f824a908affbcfc53863f91bea18600fda4e338e4"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.665733 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.676858 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbbc99994-cwbpw" event={"ID":"93b0a572-437e-4d15-a74d-e92c0f39c9cc","Type":"ContainerDied","Data":"985ea34c0a9778039db9bb9bc7969fabdc2e50c353ec77704428e8eca67e386c"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.677128 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bbbc99994-cwbpw" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.690623 4789 scope.go:117] "RemoveContainer" containerID="38cb122304e1fdaa474b979a230d2c0c3d7a6825b41760a62e2b6923a3837070" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.696408 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-864d99d789-mv5rh" event={"ID":"f00adc24-beed-43df-95a8-274b841d60a0","Type":"ContainerDied","Data":"46a86ae055389e3e45af7fd0e66dfa93de8caa3ffd8d4abed4b008fc72618855"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.696501 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-864d99d789-mv5rh" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.703565 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.712141 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6cfddfd9f4-hmzjm"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.713176 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29706741-1258-454c-968f-836e472cb685","Type":"ContainerDied","Data":"83d208ca2b5fca84de26a8f9dde0666c1e7b4c4ed9527c11423f74e502920589"} Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.714002 4789 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapifc07-account-delete-bxbmv" secret="" err="secret \"galera-openstack-dockercfg-5nxcs\" not found" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.714612 4789 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/barbican30bf-account-delete-mwl8m" secret="" err="secret \"galera-openstack-dockercfg-5nxcs\" not found" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.714055 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cw7z9" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.714216 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.720320 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.724070 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data" (OuterVolumeSpecName: "config-data") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.743179 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.743207 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.743216 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.743403 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc 
kubenswrapper[4789]: I1216 07:16:19.743416 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.743426 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.743434 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skc8q\" (UniqueName: \"kubernetes.io/projected/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-kube-api-access-skc8q\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.745033 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b48fd45b4-hp2xw"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.759978 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b48fd45b4-hp2xw"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.771408 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bbbc99994-cwbpw"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.781487 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.782947 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-bbbc99994-cwbpw"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.791088 4789 scope.go:117] "RemoveContainer" containerID="dfe47974cb64535408aeb67063f1a4814aa8aaad5cefa3463823fe0dd085e7b6" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.792867 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" (UID: "fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.844415 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.844440 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 07:16:19.845482 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-864d99d789-mv5rh"] Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.867129 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:16:19 crc kubenswrapper[4789]: I1216 
07:16:19.869107 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-864d99d789-mv5rh"] Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.875066 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.876308 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 07:16:19 crc kubenswrapper[4789]: E1216 07:16:19.876458 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8680ae27-3e72-416b-9983-9b195fedcefb" containerName="nova-scheduler-scheduler" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.007016 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.015155 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.051965 4789 scope.go:117] "RemoveContainer" containerID="075adba855be9f7509e9630110074278e486377f145b6b3fe7500199bbeb6d6c" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.063885 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.072167 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.079234 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.087672 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.108944 4789 scope.go:117] "RemoveContainer" containerID="f58f590eff39129dc3fd6cbf997894d78a3061978019b25eafe3b8d013aa5949" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.139370 4789 scope.go:117] "RemoveContainer" containerID="640501fd43f4fcce68155ec6cb24a721ad4ca1ea36ae7f97fe8a96f2974be91e" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.143264 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00293d36-0c18-4d79-aacd-4224045ff895" path="/var/lib/kubelet/pods/00293d36-0c18-4d79-aacd-4224045ff895/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.144205 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29706741-1258-454c-968f-836e472cb685" path="/var/lib/kubelet/pods/29706741-1258-454c-968f-836e472cb685/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.144754 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="358d8958-a563-407c-8b7f-75aee19a3699" path="/var/lib/kubelet/pods/358d8958-a563-407c-8b7f-75aee19a3699/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 
07:16:20.145364 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8368d044-b088-48f9-b5cb-19a95b997576" path="/var/lib/kubelet/pods/8368d044-b088-48f9-b5cb-19a95b997576/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.150126 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" path="/var/lib/kubelet/pods/93b0a572-437e-4d15-a74d-e92c0f39c9cc/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.150986 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" path="/var/lib/kubelet/pods/b6eb716a-c1c8-47f7-bef8-68fbdf9162c0/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.159739 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9stf\" (UniqueName: \"kubernetes.io/projected/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-kube-api-access-c9stf\") pod \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.159794 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-logs\") pod \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.159906 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b6mh\" (UniqueName: \"kubernetes.io/projected/5031d0ac-42ac-4346-9403-0369a555ab4a-kube-api-access-4b6mh\") pod \"5031d0ac-42ac-4346-9403-0369a555ab4a\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.159942 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5031d0ac-42ac-4346-9403-0369a555ab4a-logs\") pod \"5031d0ac-42ac-4346-9403-0369a555ab4a\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.159986 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-public-tls-certs\") pod \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.160022 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-combined-ca-bundle\") pod \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.160125 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-nova-metadata-tls-certs\") pod \"5031d0ac-42ac-4346-9403-0369a555ab4a\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.160158 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-internal-tls-certs\") pod \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.160187 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-config-data\") pod \"5031d0ac-42ac-4346-9403-0369a555ab4a\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 
07:16:20.160211 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-config-data\") pod \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\" (UID: \"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.160232 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-combined-ca-bundle\") pod \"5031d0ac-42ac-4346-9403-0369a555ab4a\" (UID: \"5031d0ac-42ac-4346-9403-0369a555ab4a\") " Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.160826 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.160838 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de637363-990a-4590-b9c5-ab66c18ec270" path="/var/lib/kubelet/pods/de637363-990a-4590-b9c5-ab66c18ec270/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.160887 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts podName:0491a70b-b044-4ec4-b179-778967cd4573 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:22.160872133 +0000 UTC m=+1520.422759762 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts") pod "novaapifc07-account-delete-bxbmv" (UID: "0491a70b-b044-4ec4-b179-778967cd4573") : configmap "openstack-scripts" not found Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.162231 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20ac101-1bcb-4ca7-8a77-e827c5eb6383" path="/var/lib/kubelet/pods/e20ac101-1bcb-4ca7-8a77-e827c5eb6383/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.163134 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f00adc24-beed-43df-95a8-274b841d60a0" path="/var/lib/kubelet/pods/f00adc24-beed-43df-95a8-274b841d60a0/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.164693 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" path="/var/lib/kubelet/pods/f699c71b-1e44-4a4d-b1fb-77ef105af03d/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.165467 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" path="/var/lib/kubelet/pods/fc02bf7e-2d67-40a4-94b0-5807631a5b2e/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.166330 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" path="/var/lib/kubelet/pods/fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.167632 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8a84b3-ac5c-4ac8-a302-591548a970dd" path="/var/lib/kubelet/pods/fe8a84b3-ac5c-4ac8-a302-591548a970dd/volumes" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.168560 4789 scope.go:117] "RemoveContainer" containerID="063de08125ee34cdd307d91af26a93c26622ecd506fc1ef247a55845563c91d4" Dec 16 07:16:20 crc 
kubenswrapper[4789]: E1216 07:16:20.172151 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.172365 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts podName:1964cf41-49d7-4b0d-ab8b-fbf9b621e359 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:22.172343593 +0000 UTC m=+1520.434231222 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts") pod "barbican30bf-account-delete-mwl8m" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359") : configmap "openstack-scripts" not found Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.174891 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-logs" (OuterVolumeSpecName: "logs") pod "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" (UID: "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.175259 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5031d0ac-42ac-4346-9403-0369a555ab4a-kube-api-access-4b6mh" (OuterVolumeSpecName: "kube-api-access-4b6mh") pod "5031d0ac-42ac-4346-9403-0369a555ab4a" (UID: "5031d0ac-42ac-4346-9403-0369a555ab4a"). InnerVolumeSpecName "kube-api-access-4b6mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.181569 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5031d0ac-42ac-4346-9403-0369a555ab4a-logs" (OuterVolumeSpecName: "logs") pod "5031d0ac-42ac-4346-9403-0369a555ab4a" (UID: "5031d0ac-42ac-4346-9403-0369a555ab4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.182013 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-kube-api-access-c9stf" (OuterVolumeSpecName: "kube-api-access-c9stf") pod "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" (UID: "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84"). InnerVolumeSpecName "kube-api-access-c9stf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.209259 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-cw7z9"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.209291 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-cw7z9"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.230557 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" (UID: "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.240232 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-config-data" (OuterVolumeSpecName: "config-data") pod "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" (UID: "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.262597 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-config-data" (OuterVolumeSpecName: "config-data") pod "5031d0ac-42ac-4346-9403-0369a555ab4a" (UID: "5031d0ac-42ac-4346-9403-0369a555ab4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.263876 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9stf\" (UniqueName: \"kubernetes.io/projected/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-kube-api-access-c9stf\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.264034 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.264122 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b6mh\" (UniqueName: \"kubernetes.io/projected/5031d0ac-42ac-4346-9403-0369a555ab4a-kube-api-access-4b6mh\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.264187 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5031d0ac-42ac-4346-9403-0369a555ab4a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc 
kubenswrapper[4789]: I1216 07:16:20.264246 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.264305 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.264365 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.278656 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5031d0ac-42ac-4346-9403-0369a555ab4a" (UID: "5031d0ac-42ac-4346-9403-0369a555ab4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.290676 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.291346 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" (UID: "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.293562 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.295801 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.295855 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" containerName="nova-cell1-conductor-conductor" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.298583 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" (UID: "1d2a6bf4-542c-4b1c-b461-6b6f427a1d84"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.301508 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5031d0ac-42ac-4346-9403-0369a555ab4a" (UID: "5031d0ac-42ac-4346-9403-0369a555ab4a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.326964 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.339723 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.345125 4789 scope.go:117] "RemoveContainer" containerID="f71a167007b51e6b5519402191320729c426676f344ae45a6739ee1603881192" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.367850 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.367883 4789 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.367894 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.367903 4789 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5031d0ac-42ac-4346-9403-0369a555ab4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.367983 4789 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:20 crc kubenswrapper[4789]: E1216 07:16:20.368030 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data podName:9452e1b2-42ec-47b6-96e1-2770c9e76db2 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:28.368015432 +0000 UTC m=+1526.629903061 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data") pod "rabbitmq-cell1-server-0" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2") : configmap "rabbitmq-cell1-config-data" not found Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.407304 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.471758 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-httpd-run\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.471826 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4762f\" (UniqueName: \"kubernetes.io/projected/db3b1c91-5558-4afb-a9fc-dd75527451ee-kube-api-access-4762f\") pod \"db3b1c91-5558-4afb-a9fc-dd75527451ee\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.471866 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-config-data\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.471937 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.471998 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-logs\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.472162 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-scripts\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.472346 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.472384 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-logs" (OuterVolumeSpecName: "logs") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.475144 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476101 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-scripts" (OuterVolumeSpecName: "scripts") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476381 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-config-data\") pod \"db3b1c91-5558-4afb-a9fc-dd75527451ee\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476440 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-combined-ca-bundle\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476508 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-memcached-tls-certs\") pod \"db3b1c91-5558-4afb-a9fc-dd75527451ee\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476539 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-internal-tls-certs\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476566 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl9hf\" (UniqueName: \"kubernetes.io/projected/37216df1-3f61-412b-bffb-5e36812383f4-kube-api-access-cl9hf\") pod \"37216df1-3f61-412b-bffb-5e36812383f4\" (UID: \"37216df1-3f61-412b-bffb-5e36812383f4\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476618 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-kolla-config\") pod \"db3b1c91-5558-4afb-a9fc-dd75527451ee\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.476683 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-combined-ca-bundle\") pod \"db3b1c91-5558-4afb-a9fc-dd75527451ee\" (UID: \"db3b1c91-5558-4afb-a9fc-dd75527451ee\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.477448 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3b1c91-5558-4afb-a9fc-dd75527451ee-kube-api-access-4762f" (OuterVolumeSpecName: "kube-api-access-4762f") pod "db3b1c91-5558-4afb-a9fc-dd75527451ee" (UID: "db3b1c91-5558-4afb-a9fc-dd75527451ee"). InnerVolumeSpecName "kube-api-access-4762f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.478056 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-config-data" (OuterVolumeSpecName: "config-data") pod "db3b1c91-5558-4afb-a9fc-dd75527451ee" (UID: "db3b1c91-5558-4afb-a9fc-dd75527451ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.478405 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "db3b1c91-5558-4afb-a9fc-dd75527451ee" (UID: "db3b1c91-5558-4afb-a9fc-dd75527451ee"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.480540 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.480605 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.480632 4789 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/db3b1c91-5558-4afb-a9fc-dd75527451ee-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.480664 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.480682 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4762f\" (UniqueName: \"kubernetes.io/projected/db3b1c91-5558-4afb-a9fc-dd75527451ee-kube-api-access-4762f\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.480735 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.481077 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37216df1-3f61-412b-bffb-5e36812383f4-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.482062 4789 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/37216df1-3f61-412b-bffb-5e36812383f4-kube-api-access-cl9hf" (OuterVolumeSpecName: "kube-api-access-cl9hf") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "kube-api-access-cl9hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.484253 4789 scope.go:117] "RemoveContainer" containerID="0f48a94f282f9b6fdc9bae2f55acd33cf0b6397237ba430e41c42e3e2660b0b4" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.486050 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.521166 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.525186 4789 scope.go:117] "RemoveContainer" containerID="305b96e7f12f126be8501fb24906a6e570466b444e3bf1c2ee9a66e30d5add7f" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.541773 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.553762 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.554762 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.565017 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.572176 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-config-data" (OuterVolumeSpecName: "config-data") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.582163 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db3b1c91-5558-4afb-a9fc-dd75527451ee" (UID: "db3b1c91-5558-4afb-a9fc-dd75527451ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.583707 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82acf941-5ce6-4e18-bc6d-1809296622eb-operator-scripts\") pod \"82acf941-5ce6-4e18-bc6d-1809296622eb\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.583753 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-public-tls-certs\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.583784 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.583807 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzb7\" (UniqueName: \"kubernetes.io/projected/7f2338e7-2de7-4149-bb6a-ae978c7e096a-kube-api-access-pdzb7\") pod \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\" (UID: \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.583861 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkfm2\" (UniqueName: \"kubernetes.io/projected/82acf941-5ce6-4e18-bc6d-1809296622eb-kube-api-access-gkfm2\") pod \"82acf941-5ce6-4e18-bc6d-1809296622eb\" (UID: \"82acf941-5ce6-4e18-bc6d-1809296622eb\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.583892 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs8vg\" 
(UniqueName: \"kubernetes.io/projected/a6423ab7-79a3-402c-9115-e54b5f29ad05-kube-api-access-rs8vg\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.583994 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/bf1af2cc-24b9-4786-befa-74623fca05f7-kube-api-access-bfw6l\") pod \"bf1af2cc-24b9-4786-befa-74623fca05f7\" (UID: \"bf1af2cc-24b9-4786-befa-74623fca05f7\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584021 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1af2cc-24b9-4786-befa-74623fca05f7-operator-scripts\") pod \"bf1af2cc-24b9-4786-befa-74623fca05f7\" (UID: \"bf1af2cc-24b9-4786-befa-74623fca05f7\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584047 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-scripts\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584088 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-combined-ca-bundle\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584112 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2338e7-2de7-4149-bb6a-ae978c7e096a-operator-scripts\") pod \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\" (UID: \"7f2338e7-2de7-4149-bb6a-ae978c7e096a\") " Dec 16 07:16:20 crc 
kubenswrapper[4789]: I1216 07:16:20.584144 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdea835b-f122-4db5-b7c1-ca180d9f3853-operator-scripts\") pod \"fdea835b-f122-4db5-b7c1-ca180d9f3853\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584172 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-config-data\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584200 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kkvn\" (UniqueName: \"kubernetes.io/projected/fdea835b-f122-4db5-b7c1-ca180d9f3853-kube-api-access-2kkvn\") pod \"fdea835b-f122-4db5-b7c1-ca180d9f3853\" (UID: \"fdea835b-f122-4db5-b7c1-ca180d9f3853\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584225 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-logs\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584249 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-httpd-run\") pod \"a6423ab7-79a3-402c-9115-e54b5f29ad05\" (UID: \"a6423ab7-79a3-402c-9115-e54b5f29ad05\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584545 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584567 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584579 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584591 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.584602 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl9hf\" (UniqueName: \"kubernetes.io/projected/37216df1-3f61-412b-bffb-5e36812383f4-kube-api-access-cl9hf\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.585067 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.585418 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82acf941-5ce6-4e18-bc6d-1809296622eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82acf941-5ce6-4e18-bc6d-1809296622eb" (UID: "82acf941-5ce6-4e18-bc6d-1809296622eb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.591463 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "37216df1-3f61-412b-bffb-5e36812383f4" (UID: "37216df1-3f61-412b-bffb-5e36812383f4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.591623 4789 scope.go:117] "RemoveContainer" containerID="a0a363829296ba32ca72eb47110c7682a3a5f2c237669efd7a55541ecf92e1ab" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.596897 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-scripts" (OuterVolumeSpecName: "scripts") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.600549 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.603038 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2338e7-2de7-4149-bb6a-ae978c7e096a-kube-api-access-pdzb7" (OuterVolumeSpecName: "kube-api-access-pdzb7") pod "7f2338e7-2de7-4149-bb6a-ae978c7e096a" (UID: "7f2338e7-2de7-4149-bb6a-ae978c7e096a"). InnerVolumeSpecName "kube-api-access-pdzb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.606308 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82acf941-5ce6-4e18-bc6d-1809296622eb-kube-api-access-gkfm2" (OuterVolumeSpecName: "kube-api-access-gkfm2") pod "82acf941-5ce6-4e18-bc6d-1809296622eb" (UID: "82acf941-5ce6-4e18-bc6d-1809296622eb"). InnerVolumeSpecName "kube-api-access-gkfm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.606820 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1af2cc-24b9-4786-befa-74623fca05f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf1af2cc-24b9-4786-befa-74623fca05f7" (UID: "bf1af2cc-24b9-4786-befa-74623fca05f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.609783 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f2338e7-2de7-4149-bb6a-ae978c7e096a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f2338e7-2de7-4149-bb6a-ae978c7e096a" (UID: "7f2338e7-2de7-4149-bb6a-ae978c7e096a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.610993 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-logs" (OuterVolumeSpecName: "logs") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.612881 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdea835b-f122-4db5-b7c1-ca180d9f3853-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdea835b-f122-4db5-b7c1-ca180d9f3853" (UID: "fdea835b-f122-4db5-b7c1-ca180d9f3853"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.613040 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1af2cc-24b9-4786-befa-74623fca05f7-kube-api-access-bfw6l" (OuterVolumeSpecName: "kube-api-access-bfw6l") pod "bf1af2cc-24b9-4786-befa-74623fca05f7" (UID: "bf1af2cc-24b9-4786-befa-74623fca05f7"). InnerVolumeSpecName "kube-api-access-bfw6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.613173 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6423ab7-79a3-402c-9115-e54b5f29ad05-kube-api-access-rs8vg" (OuterVolumeSpecName: "kube-api-access-rs8vg") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "kube-api-access-rs8vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.616171 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdea835b-f122-4db5-b7c1-ca180d9f3853-kube-api-access-2kkvn" (OuterVolumeSpecName: "kube-api-access-2kkvn") pod "fdea835b-f122-4db5-b7c1-ca180d9f3853" (UID: "fdea835b-f122-4db5-b7c1-ca180d9f3853"). InnerVolumeSpecName "kube-api-access-2kkvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.618802 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "db3b1c91-5558-4afb-a9fc-dd75527451ee" (UID: "db3b1c91-5558-4afb-a9fc-dd75527451ee"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.636656 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.641623 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.654935 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-config-data" (OuterVolumeSpecName: "config-data") pod "a6423ab7-79a3-402c-9115-e54b5f29ad05" (UID: "a6423ab7-79a3-402c-9115-e54b5f29ad05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.685630 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.685877 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzb7\" (UniqueName: \"kubernetes.io/projected/7f2338e7-2de7-4149-bb6a-ae978c7e096a-kube-api-access-pdzb7\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.685978 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686053 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkfm2\" (UniqueName: \"kubernetes.io/projected/82acf941-5ce6-4e18-bc6d-1809296622eb-kube-api-access-gkfm2\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686120 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs8vg\" (UniqueName: \"kubernetes.io/projected/a6423ab7-79a3-402c-9115-e54b5f29ad05-kube-api-access-rs8vg\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686173 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/bf1af2cc-24b9-4786-befa-74623fca05f7-kube-api-access-bfw6l\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686222 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1af2cc-24b9-4786-befa-74623fca05f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc 
kubenswrapper[4789]: I1216 07:16:20.686270 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686339 4789 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/db3b1c91-5558-4afb-a9fc-dd75527451ee-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686392 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37216df1-3f61-412b-bffb-5e36812383f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686448 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686536 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f2338e7-2de7-4149-bb6a-ae978c7e096a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686592 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdea835b-f122-4db5-b7c1-ca180d9f3853-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686656 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6423ab7-79a3-402c-9115-e54b5f29ad05-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686746 4789 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2kkvn\" (UniqueName: \"kubernetes.io/projected/fdea835b-f122-4db5-b7c1-ca180d9f3853-kube-api-access-2kkvn\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686807 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686858 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6423ab7-79a3-402c-9115-e54b5f29ad05-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.686940 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82acf941-5ce6-4e18-bc6d-1809296622eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.710867 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.728776 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5031d0ac-42ac-4346-9403-0369a555ab4a","Type":"ContainerDied","Data":"12bc2b0ccc4896a733ff6dd23c362de481d00b71e764660a9d3291b2c0241000"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.729331 4789 scope.go:117] "RemoveContainer" containerID="eea19a9862fbb3d803e89b2d7c5b8ef5c9fd9bd0d293359d33b0d30a3cecccac" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.729055 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.736487 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.736873 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"db3b1c91-5558-4afb-a9fc-dd75527451ee","Type":"ContainerDied","Data":"49f1a7097b5b2b7be517e1823df3bb70c8861aa3dd39af5ac54e40693abb09fe"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.740148 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04bcc-account-delete-wzgv2" event={"ID":"fdea835b-f122-4db5-b7c1-ca180d9f3853","Type":"ContainerDied","Data":"7a9499badf06cab15799c2847dee27997b0abd458005dd60c0bd4bf7c8e78b2b"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.740185 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a9499badf06cab15799c2847dee27997b0abd458005dd60c0bd4bf7c8e78b2b" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.740236 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04bcc-account-delete-wzgv2" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.746324 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37216df1-3f61-412b-bffb-5e36812383f4","Type":"ContainerDied","Data":"b9db1e248cd008608e0c0683297e0cb2bb75ce3f9381a556ee2dccfb2dfb1a3c"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.746371 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.749125 4789 generic.go:334] "Generic (PLEG): container finished" podID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerID="238b569af7959004c01bd0394274b3ef8d6991fd0c3fdae6cc211fa624cb5354" exitCode=0 Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.749303 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31336d9f-38cf-4805-927b-3ae986f6c88e","Type":"ContainerDied","Data":"238b569af7959004c01bd0394274b3ef8d6991fd0c3fdae6cc211fa624cb5354"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.751049 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="00293d36-0c18-4d79-aacd-4224045ff895" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.197:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.751715 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9ba2-account-delete-tgp7b" event={"ID":"7f2338e7-2de7-4149-bb6a-ae978c7e096a","Type":"ContainerDied","Data":"6b9cc156b6a77e8c8e3f7c557a6352e3ae3051572a21e9eec09e97cc507e2c42"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.751814 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9cc156b6a77e8c8e3f7c557a6352e3ae3051572a21e9eec09e97cc507e2c42" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.751952 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance9ba2-account-delete-tgp7b" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.756381 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d2a6bf4-542c-4b1c-b461-6b6f427a1d84","Type":"ContainerDied","Data":"6a2254f0c2484dc8347faedb131ca81c1258a021d3fe314d12a7dd12d706b108"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.756510 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.765728 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5298-account-delete-lmmgd" event={"ID":"82acf941-5ce6-4e18-bc6d-1809296622eb","Type":"ContainerDied","Data":"cc8cad5792de3024a72ca536635b38e2a98f99e729004df0773d87824e5dd86e"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.766015 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc8cad5792de3024a72ca536635b38e2a98f99e729004df0773d87824e5dd86e" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.766428 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron5298-account-delete-lmmgd" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.766612 4789 scope.go:117] "RemoveContainer" containerID="c70725fd021c9c4c1eba5b71db3f401cd478eb9326f0510427dc30f5843bb19c" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.789428 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.796943 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.803273 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6423ab7-79a3-402c-9115-e54b5f29ad05","Type":"ContainerDied","Data":"f88bdaeef19a48892d151efa3947bbd68842a623b72aee8fcce3361a02b0092e"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.803392 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.806903 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder9a65-account-delete-cbfmk" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.806926 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder9a65-account-delete-cbfmk" event={"ID":"bf1af2cc-24b9-4786-befa-74623fca05f7","Type":"ContainerDied","Data":"a1872839aff7c6aaf90d00b55620f39375ddd1da68d35e0d41748179ba2ee470"} Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.807097 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1872839aff7c6aaf90d00b55620f39375ddd1da68d35e0d41748179ba2ee470" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.808885 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.840561 4789 scope.go:117] "RemoveContainer" containerID="530a5d65e6717bf5cb2f7088a0efb5a0b81a55b156e61bc16cea37ce6d84bbe3" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.869907 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.879982 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.886761 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.893583 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.984037 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.993967 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31336d9f-38cf-4805-927b-3ae986f6c88e-erlang-cookie-secret\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.994032 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31336d9f-38cf-4805-927b-3ae986f6c88e-pod-info\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.994118 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-plugins-conf\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.994252 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-erlang-cookie\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.994419 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxw8s\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-kube-api-access-wxw8s\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.994599 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-tls\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.994749 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-confd\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.994898 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-server-conf\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.995206 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-plugins\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.995243 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.995812 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data\") pod \"31336d9f-38cf-4805-927b-3ae986f6c88e\" (UID: \"31336d9f-38cf-4805-927b-3ae986f6c88e\") " Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 
07:16:20.996772 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.997089 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:20 crc kubenswrapper[4789]: I1216 07:16:20.997293 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.004504 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31336d9f-38cf-4805-927b-3ae986f6c88e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.004618 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/31336d9f-38cf-4805-927b-3ae986f6c88e-pod-info" (OuterVolumeSpecName: "pod-info") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.005353 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.010362 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-kube-api-access-wxw8s" (OuterVolumeSpecName: "kube-api-access-wxw8s") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "kube-api-access-wxw8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.013100 4789 scope.go:117] "RemoveContainer" containerID="82b67d2f0b7d827d390a1737f28832c66819bfb58a92aae85e467319754e80a4" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.013870 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.018043 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.025893 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.046511 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data" (OuterVolumeSpecName: "config-data") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.055954 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.065050 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.096586 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-server-conf" (OuterVolumeSpecName: "server-conf") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097588 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097604 4789 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31336d9f-38cf-4805-927b-3ae986f6c88e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097616 4789 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31336d9f-38cf-4805-927b-3ae986f6c88e-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097624 4789 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097632 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097640 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxw8s\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-kube-api-access-wxw8s\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097648 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 
07:16:21.097656 4789 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31336d9f-38cf-4805-927b-3ae986f6c88e-server-conf\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097665 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.097691 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.114662 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.141379 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "31336d9f-38cf-4805-927b-3ae986f6c88e" (UID: "31336d9f-38cf-4805-927b-3ae986f6c88e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.199412 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31336d9f-38cf-4805-927b-3ae986f6c88e-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.199442 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.302386 4789 scope.go:117] "RemoveContainer" containerID="1ca313e4e286bdf363a60db146512417c224d3addedf2f21605dd96befee2ec7"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.367345 4789 scope.go:117] "RemoveContainer" containerID="17038ebe8bb333b1b73372f442dc14a740d2ad41822921ccc7adc857ef4a9c8b"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.465281 4789 scope.go:117] "RemoveContainer" containerID="4d63dd7640d74fbc26ecb3092e3b345de818b9a1c89962de76be7288485fe546"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.489632 4789 scope.go:117] "RemoveContainer" containerID="79b3fcff6d02b1d1105cdaa7d49563ca416afc3d0d209ae94eae9fd336eca759"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.534584 4789 scope.go:117] "RemoveContainer" containerID="e13e48820a043301c899b786263392f63ecec23d16cafd76439ce501fb5d2638"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.633656 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6789db9888-57dmq"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720338 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-internal-tls-certs\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720385 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-fernet-keys\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720462 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-combined-ca-bundle\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720524 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-config-data\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720567 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-credential-keys\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720585 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6pff\" (UniqueName: \"kubernetes.io/projected/254f667d-eae3-486b-b9e8-ffc571d65635-kube-api-access-k6pff\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720608 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-public-tls-certs\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.720663 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-scripts\") pod \"254f667d-eae3-486b-b9e8-ffc571d65635\" (UID: \"254f667d-eae3-486b-b9e8-ffc571d65635\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.726330 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.726967 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.727788 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-scripts" (OuterVolumeSpecName: "scripts") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.730424 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254f667d-eae3-486b-b9e8-ffc571d65635-kube-api-access-k6pff" (OuterVolumeSpecName: "kube-api-access-k6pff") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "kube-api-access-k6pff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.761904 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.781350 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.782342 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.791361 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-config-data" (OuterVolumeSpecName: "config-data") pod "254f667d-eae3-486b-b9e8-ffc571d65635" (UID: "254f667d-eae3-486b-b9e8-ffc571d65635"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822387 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822417 4789 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822425 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822480 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822488 4789 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822497 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6pff\" (UniqueName: \"kubernetes.io/projected/254f667d-eae3-486b-b9e8-ffc571d65635-kube-api-access-k6pff\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822507 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822514 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254f667d-eae3-486b-b9e8-ffc571d65635-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.822902 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5vnz5"]
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.829874 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5vnz5"]
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.832126 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_63f88379-6b15-47a6-bf24-7cf0b3edc56a/ovn-northd/0.log"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.832196 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.843969 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9a65-account-create-update-ghjsp"]
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.844817 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.845758 4789 generic.go:334] "Generic (PLEG): container finished" podID="8680ae27-3e72-416b-9983-9b195fedcefb" containerID="3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b" exitCode=0
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.845825 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8680ae27-3e72-416b-9983-9b195fedcefb","Type":"ContainerDied","Data":"3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.855876 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder9a65-account-delete-cbfmk"]
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.865739 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.865769 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31336d9f-38cf-4805-927b-3ae986f6c88e","Type":"ContainerDied","Data":"664e34f45fd05d46960a179707cf44e99059311b9833755a4dd8fc792989bfe9"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.865825 4789 scope.go:117] "RemoveContainer" containerID="238b569af7959004c01bd0394274b3ef8d6991fd0c3fdae6cc211fa624cb5354"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.868720 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9a65-account-create-update-ghjsp"]
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.873728 4789 generic.go:334] "Generic (PLEG): container finished" podID="254f667d-eae3-486b-b9e8-ffc571d65635" containerID="d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269" exitCode=0
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.873844 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6789db9888-57dmq" event={"ID":"254f667d-eae3-486b-b9e8-ffc571d65635","Type":"ContainerDied","Data":"d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.873875 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6789db9888-57dmq" event={"ID":"254f667d-eae3-486b-b9e8-ffc571d65635","Type":"ContainerDied","Data":"2aa198f94427fbc639cd9718e61ee6d951f034ffbb159b6995767295359c5369"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.873971 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6789db9888-57dmq"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.879432 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder9a65-account-delete-cbfmk"]
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.880941 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_63f88379-6b15-47a6-bf24-7cf0b3edc56a/ovn-northd/0.log"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.880985 4789 generic.go:334] "Generic (PLEG): container finished" podID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" exitCode=139
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.881038 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"63f88379-6b15-47a6-bf24-7cf0b3edc56a","Type":"ContainerDied","Data":"81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.881066 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"63f88379-6b15-47a6-bf24-7cf0b3edc56a","Type":"ContainerDied","Data":"e341a1ba4af9cf60c5f85558bfe8ebe06fb1fae67a426bf30ca491dc31891589"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.881121 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.922255 4789 generic.go:334] "Generic (PLEG): container finished" podID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerID="fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362" exitCode=0
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.922323 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.922330 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9452e1b2-42ec-47b6-96e1-2770c9e76db2","Type":"ContainerDied","Data":"fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.922576 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9452e1b2-42ec-47b6-96e1-2770c9e76db2","Type":"ContainerDied","Data":"5c4015381169ad0d393422d1b91124ef0e4d0ced98d367a59839b4ee28aa3294"}
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.923527 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-northd-tls-certs\") pod \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.923623 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-plugins-conf\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.923692 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.923723 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-combined-ca-bundle\") pod \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.923941 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-confd\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.923968 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-scripts\") pod \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.925857 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.926237 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-metrics-certs-tls-certs\") pod \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.926273 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-rundir\") pod \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.926302 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.926333 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp9kb\" (UniqueName: \"kubernetes.io/projected/63f88379-6b15-47a6-bf24-7cf0b3edc56a-kube-api-access-dp9kb\") pod \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.926398 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9452e1b2-42ec-47b6-96e1-2770c9e76db2-pod-info\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.926472 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-tls\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.927634 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-erlang-cookie\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.927711 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9452e1b2-42ec-47b6-96e1-2770c9e76db2-erlang-cookie-secret\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.927764 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-server-conf\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.927802 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-plugins\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.927848 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pxnm\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-kube-api-access-9pxnm\") pod \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\" (UID: \"9452e1b2-42ec-47b6-96e1-2770c9e76db2\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.927893 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-config\") pod \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\" (UID: \"63f88379-6b15-47a6-bf24-7cf0b3edc56a\") "
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.928938 4789 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-plugins-conf\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.930218 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "63f88379-6b15-47a6-bf24-7cf0b3edc56a" (UID: "63f88379-6b15-47a6-bf24-7cf0b3edc56a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.932148 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.932397 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.933009 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-config" (OuterVolumeSpecName: "config") pod "63f88379-6b15-47a6-bf24-7cf0b3edc56a" (UID: "63f88379-6b15-47a6-bf24-7cf0b3edc56a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.933328 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-scripts" (OuterVolumeSpecName: "scripts") pod "63f88379-6b15-47a6-bf24-7cf0b3edc56a" (UID: "63f88379-6b15-47a6-bf24-7cf0b3edc56a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.957176 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.965068 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.968360 4789 scope.go:117] "RemoveContainer" containerID="ff767a7cabcaa4f9752eac58d5657fbc09c94d5629fc004f9ab8e06f780b0a62"
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.983963 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9452e1b2-42ec-47b6-96e1-2770c9e76db2-pod-info" (OuterVolumeSpecName: "pod-info") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.985742 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f88379-6b15-47a6-bf24-7cf0b3edc56a-kube-api-access-dp9kb" (OuterVolumeSpecName: "kube-api-access-dp9kb") pod "63f88379-6b15-47a6-bf24-7cf0b3edc56a" (UID: "63f88379-6b15-47a6-bf24-7cf0b3edc56a"). InnerVolumeSpecName "kube-api-access-dp9kb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:16:21 crc kubenswrapper[4789]: I1216 07:16:21.985809 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-kube-api-access-9pxnm" (OuterVolumeSpecName: "kube-api-access-9pxnm") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "kube-api-access-9pxnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.006853 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9452e1b2-42ec-47b6-96e1-2770c9e76db2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.014268 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6789db9888-57dmq"]
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.024599 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data" (OuterVolumeSpecName: "config-data") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030160 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030195 4789 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9452e1b2-42ec-47b6-96e1-2770c9e76db2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030207 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030219 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pxnm\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-kube-api-access-9pxnm\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030231 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-config\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030242 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030254 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63f88379-6b15-47a6-bf24-7cf0b3edc56a-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030265 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-rundir\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030287 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030299 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp9kb\" (UniqueName: \"kubernetes.io/projected/63f88379-6b15-47a6-bf24-7cf0b3edc56a-kube-api-access-dp9kb\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030310 4789 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9452e1b2-42ec-47b6-96e1-2770c9e76db2-pod-info\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.030321 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.034896 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6789db9888-57dmq"]
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.046718 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.057025 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.059374 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.064163 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-n2j8s"]
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.065006 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "63f88379-6b15-47a6-bf24-7cf0b3edc56a" (UID: "63f88379-6b15-47a6-bf24-7cf0b3edc56a"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.091645 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63f88379-6b15-47a6-bf24-7cf0b3edc56a" (UID: "63f88379-6b15-47a6-bf24-7cf0b3edc56a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.082180 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-n2j8s"]
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.115614 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementdcc5-account-delete-bdqp9"]
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.122206 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "63f88379-6b15-47a6-bf24-7cf0b3edc56a" (UID: "63f88379-6b15-47a6-bf24-7cf0b3edc56a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.124140 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-server-conf" (OuterVolumeSpecName: "server-conf") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.135596 4789 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9452e1b2-42ec-47b6-96e1-2770c9e76db2-server-conf\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.135632 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.135645 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.135659 4789 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63f88379-6b15-47a6-bf24-7cf0b3edc56a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.135672 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.145939 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9452e1b2-42ec-47b6-96e1-2770c9e76db2" (UID: "9452e1b2-42ec-47b6-96e1-2770c9e76db2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.148828 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d1ec63-1e07-4430-84f6-6f356d6cb420" path="/var/lib/kubelet/pods/16d1ec63-1e07-4430-84f6-6f356d6cb420/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.152165 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" path="/var/lib/kubelet/pods/1d2a6bf4-542c-4b1c-b461-6b6f427a1d84/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.153957 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" path="/var/lib/kubelet/pods/1e16a3ef-920e-493a-ae2f-7336d64bbd7e/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.155687 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.156874 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254f667d-eae3-486b-b9e8-ffc571d65635" path="/var/lib/kubelet/pods/254f667d-eae3-486b-b9e8-ffc571d65635/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.157941 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" path="/var/lib/kubelet/pods/31336d9f-38cf-4805-927b-3ae986f6c88e/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.159576 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37216df1-3f61-412b-bffb-5e36812383f4" 
path="/var/lib/kubelet/pods/37216df1-3f61-412b-bffb-5e36812383f4/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.160572 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" path="/var/lib/kubelet/pods/5031d0ac-42ac-4346-9403-0369a555ab4a/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.161045 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.161090 4789 scope.go:117] "RemoveContainer" containerID="d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.161221 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.161567 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" path="/var/lib/kubelet/pods/a6423ab7-79a3-402c-9115-e54b5f29ad05/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.163385 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.163815 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.163850 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.165635 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.165672 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.180211 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb595623-26e8-470c-bfa0-565282778cbb" 
path="/var/lib/kubelet/pods/bb595623-26e8-470c-bfa0-565282778cbb/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.186399 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1af2cc-24b9-4786-befa-74623fca05f7" path="/var/lib/kubelet/pods/bf1af2cc-24b9-4786-befa-74623fca05f7/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.187568 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf70f2c3-1a4b-44e8-87e7-1d03a302998d" path="/var/lib/kubelet/pods/cf70f2c3-1a4b-44e8-87e7-1d03a302998d/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.188269 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3b1c91-5558-4afb-a9fc-dd75527451ee" path="/var/lib/kubelet/pods/db3b1c91-5558-4afb-a9fc-dd75527451ee/volumes" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.189311 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-dcc5-account-create-update-lx5zl"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.189344 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementdcc5-account-delete-bdqp9"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.189365 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-dcc5-account-create-update-lx5zl"] Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.236888 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.236957 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9452e1b2-42ec-47b6-96e1-2770c9e76db2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.236980 4789 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts podName:1964cf41-49d7-4b0d-ab8b-fbf9b621e359 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:26.236961954 +0000 UTC m=+1524.498849583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts") pod "barbican30bf-account-delete-mwl8m" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359") : configmap "openstack-scripts" not found Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.237510 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.237829 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts podName:0491a70b-b044-4ec4-b179-778967cd4573 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:26.237729623 +0000 UTC m=+1524.499617292 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts") pod "novaapifc07-account-delete-bxbmv" (UID: "0491a70b-b044-4ec4-b179-778967cd4573") : configmap "openstack-scripts" not found Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.239631 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-cq5lc"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.263262 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-cq5lc"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.274695 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.275607 4789 scope.go:117] "RemoveContainer" containerID="d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.276468 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269\": container with ID starting with d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269 not found: ID does not exist" containerID="d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.276532 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269"} err="failed to get container status \"d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269\": rpc error: code = NotFound desc = could not find container \"d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269\": container with ID starting with d692f46a7094135d35322549141f402d1d598d636d6cc9541426ada1bb20e269 not found: ID does not exist" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.276558 4789 scope.go:117] "RemoveContainer" containerID="c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.288090 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9ba2-account-delete-tgp7b"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.313604 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9ba2-account-create-update-2225k"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.314262 4789 scope.go:117] "RemoveContainer" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" Dec 
16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.329003 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance9ba2-account-delete-tgp7b"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.335747 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9ba2-account-create-update-2225k"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.338107 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp5w5\" (UniqueName: \"kubernetes.io/projected/8680ae27-3e72-416b-9983-9b195fedcefb-kube-api-access-tp5w5\") pod \"8680ae27-3e72-416b-9983-9b195fedcefb\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.338259 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-combined-ca-bundle\") pod \"8680ae27-3e72-416b-9983-9b195fedcefb\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.338331 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-config-data\") pod \"8680ae27-3e72-416b-9983-9b195fedcefb\" (UID: \"8680ae27-3e72-416b-9983-9b195fedcefb\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.342139 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8680ae27-3e72-416b-9983-9b195fedcefb-kube-api-access-tp5w5" (OuterVolumeSpecName: "kube-api-access-tp5w5") pod "8680ae27-3e72-416b-9983-9b195fedcefb" (UID: "8680ae27-3e72-416b-9983-9b195fedcefb"). InnerVolumeSpecName "kube-api-access-tp5w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.342811 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fmwzn"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.349229 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fmwzn"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.349769 4789 scope.go:117] "RemoveContainer" containerID="c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.350833 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985\": container with ID starting with c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985 not found: ID does not exist" containerID="c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.350877 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985"} err="failed to get container status \"c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985\": rpc error: code = NotFound desc = could not find container \"c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985\": container with ID starting with c4c014f1ae773c1956037a7902631dcf1e7f25d1344b3ebdfdc9dfd1321c6985 not found: ID does not exist" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.351979 4789 scope.go:117] "RemoveContainer" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.353227 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431\": container with ID starting with 81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431 not found: ID does not exist" containerID="81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.353283 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431"} err="failed to get container status \"81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431\": rpc error: code = NotFound desc = could not find container \"81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431\": container with ID starting with 81dd05baa6ed3ee70b579efac8fbbccaffecd93ad2ce469ffb03352327a00431 not found: ID does not exist" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.353313 4789 scope.go:117] "RemoveContainer" containerID="fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.357773 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5298-account-create-update-hrzjv"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.365889 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-config-data" (OuterVolumeSpecName: "config-data") pod "8680ae27-3e72-416b-9983-9b195fedcefb" (UID: "8680ae27-3e72-416b-9983-9b195fedcefb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.369119 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron5298-account-delete-lmmgd"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.384808 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron5298-account-delete-lmmgd"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.388537 4789 scope.go:117] "RemoveContainer" containerID="e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.406711 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5298-account-create-update-hrzjv"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.419906 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.430100 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8680ae27-3e72-416b-9983-9b195fedcefb" (UID: "8680ae27-3e72-416b-9983-9b195fedcefb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.437424 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.453539 4789 scope.go:117] "RemoveContainer" containerID="fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.456106 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362\": container with ID starting with fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362 not found: ID does not exist" containerID="fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.456176 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362"} err="failed to get container status \"fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362\": rpc error: code = NotFound desc = could not find container \"fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362\": container with ID starting with fb92d13658cf48498ed6544082cfedccbb9355670cbbf2669fcf37aa29c9f362 not found: ID does not exist" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.456215 4789 scope.go:117] "RemoveContainer" containerID="e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.456462 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp5w5\" (UniqueName: \"kubernetes.io/projected/8680ae27-3e72-416b-9983-9b195fedcefb-kube-api-access-tp5w5\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.456492 4789 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.456513 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8680ae27-3e72-416b-9983-9b195fedcefb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: E1216 07:16:22.458607 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae\": container with ID starting with e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae not found: ID does not exist" containerID="e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.458686 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae"} err="failed to get container status \"e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae\": rpc error: code = NotFound desc = could not find container \"e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae\": container with ID starting with e1f35eb24585f3f70bc45aba7e94922af40a7ff72fd63976f3c1a611e4d4eeae not found: ID does not exist" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.506819 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.521693 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.537276 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-9cs4h"] Dec 16 07:16:22 crc 
kubenswrapper[4789]: I1216 07:16:22.543172 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-9cs4h"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.555257 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-30bf-account-create-update-cgmvn"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.561605 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-30bf-account-create-update-cgmvn"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.567345 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican30bf-account-delete-mwl8m"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.567603 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican30bf-account-delete-mwl8m" podUID="1964cf41-49d7-4b0d-ab8b-fbf9b621e359" containerName="mariadb-account-delete" containerID="cri-o://c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2" gracePeriod=30 Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.574811 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.601758 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-77l4n"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.613133 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-77l4n"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.621794 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapifc07-account-delete-bxbmv"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.622057 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapifc07-account-delete-bxbmv" podUID="0491a70b-b044-4ec4-b179-778967cd4573" containerName="mariadb-account-delete" containerID="cri-o://e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7" gracePeriod=30 Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.641540 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fc07-account-create-update-zxl2v"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.647239 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fc07-account-create-update-zxl2v"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660383 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-default\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660450 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-galera-tls-certs\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 
16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660562 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-kolla-config\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660582 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-combined-ca-bundle\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660614 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-operator-scripts\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660667 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-generated\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660694 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2djvc\" (UniqueName: \"kubernetes.io/projected/d868c627-a661-4c69-afd7-26d88b2be0ec-kube-api-access-2djvc\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.660714 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"d868c627-a661-4c69-afd7-26d88b2be0ec\" (UID: \"d868c627-a661-4c69-afd7-26d88b2be0ec\") " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.666812 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.667061 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.667527 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.668272 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.677297 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.677597 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d868c627-a661-4c69-afd7-26d88b2be0ec-kube-api-access-2djvc" (OuterVolumeSpecName: "kube-api-access-2djvc") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "kube-api-access-2djvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.711964 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-76kjb"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.715133 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.717120 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d868c627-a661-4c69-afd7-26d88b2be0ec" (UID: "d868c627-a661-4c69-afd7-26d88b2be0ec"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.719941 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-76kjb"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.728212 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell04bcc-account-delete-wzgv2"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.737313 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell04bcc-account-delete-wzgv2"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.744403 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4bcc-account-create-update-xk8nk"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.750211 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4bcc-account-create-update-xk8nk"] Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762560 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762587 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2djvc\" (UniqueName: \"kubernetes.io/projected/d868c627-a661-4c69-afd7-26d88b2be0ec-kube-api-access-2djvc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762616 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762629 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-config-data-default\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762641 4789 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762653 4789 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762662 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d868c627-a661-4c69-afd7-26d88b2be0ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.762672 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d868c627-a661-4c69-afd7-26d88b2be0ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.781899 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 16 07:16:22 crc kubenswrapper[4789]: I1216 07:16:22.864506 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.004613 4789 generic.go:334] "Generic (PLEG): container finished" podID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerID="08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d" exitCode=0 Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.004718 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"d868c627-a661-4c69-afd7-26d88b2be0ec","Type":"ContainerDied","Data":"08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d"} Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.004752 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d868c627-a661-4c69-afd7-26d88b2be0ec","Type":"ContainerDied","Data":"5739231fd2be0a2e5e2b3764d4bc5bab7c8d3c4dae5e2c3b256d50500c494c79"} Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.004776 4789 scope.go:117] "RemoveContainer" containerID="08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.004896 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.011381 4789 generic.go:334] "Generic (PLEG): container finished" podID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" exitCode=0 Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.011428 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe","Type":"ContainerDied","Data":"219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09"} Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.017346 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8680ae27-3e72-416b-9983-9b195fedcefb","Type":"ContainerDied","Data":"7572bea88ddbb6452e0666949ad98511513c23c8c2ce5bf46cd5129000010ee4"} Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.017450 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.027244 4789 scope.go:117] "RemoveContainer" containerID="d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.060988 4789 scope.go:117] "RemoveContainer" containerID="08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d" Dec 16 07:16:23 crc kubenswrapper[4789]: E1216 07:16:23.061825 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d\": container with ID starting with 08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d not found: ID does not exist" containerID="08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.062120 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d"} err="failed to get container status \"08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d\": rpc error: code = NotFound desc = could not find container \"08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d\": container with ID starting with 08ad7f71944f2f548c51a0d102a9a098e4725f61dda8235ac5d3f2dbc63cc08d not found: ID does not exist" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.062161 4789 scope.go:117] "RemoveContainer" containerID="d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5" Dec 16 07:16:23 crc kubenswrapper[4789]: E1216 07:16:23.062511 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5\": container with ID starting with 
d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5 not found: ID does not exist" containerID="d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.062545 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5"} err="failed to get container status \"d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5\": rpc error: code = NotFound desc = could not find container \"d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5\": container with ID starting with d4d3e2926ac80dbc928dcdcc175af65b3929abe6f85d3db598b7ca7ab07a3db5 not found: ID does not exist" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.062565 4789 scope.go:117] "RemoveContainer" containerID="3d46fd99e031b9de62468d6125ed89892bfb7d7abb0287e92c56bc86ee44280b" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.074688 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.080440 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.086454 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.092471 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.179530 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.270845 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-combined-ca-bundle\") pod \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.271044 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-config-data\") pod \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.271112 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2bmd\" (UniqueName: \"kubernetes.io/projected/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-kube-api-access-m2bmd\") pod \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\" (UID: \"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe\") " Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.274966 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-kube-api-access-m2bmd" (OuterVolumeSpecName: "kube-api-access-m2bmd") pod "e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" (UID: "e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe"). InnerVolumeSpecName "kube-api-access-m2bmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.290514 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-config-data" (OuterVolumeSpecName: "config-data") pod "e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" (UID: "e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.290571 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" (UID: "e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.372879 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.372927 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2bmd\" (UniqueName: \"kubernetes.io/projected/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-kube-api-access-m2bmd\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:23 crc kubenswrapper[4789]: I1216 07:16:23.372938 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.026354 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe","Type":"ContainerDied","Data":"fd572611943e791e5dca7dab1ae251e4ab9c924d8ab92f0a8b22914c910b1c53"} Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.026720 4789 scope.go:117] "RemoveContainer" containerID="219a139fb24b44e015b55c8a65d3933e6f97654b0e0c3d160e73532b484e6c09" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.026380 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.063596 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.068878 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.113533 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15382adc-269b-498c-ae42-a5e8a681e386" path="/var/lib/kubelet/pods/15382adc-269b-498c-ae42-a5e8a681e386/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.114107 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b89e30-7a68-4c02-8386-cc104108a8ea" path="/var/lib/kubelet/pods/24b89e30-7a68-4c02-8386-cc104108a8ea/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.114678 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" path="/var/lib/kubelet/pods/63f88379-6b15-47a6-bf24-7cf0b3edc56a/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.115709 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8b69c5-5882-42d9-8154-1a39e0b55178" path="/var/lib/kubelet/pods/7a8b69c5-5882-42d9-8154-1a39e0b55178/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.116241 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b2c4ef3-c9dd-497a-b092-d257ed4ef992" path="/var/lib/kubelet/pods/7b2c4ef3-c9dd-497a-b092-d257ed4ef992/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.116728 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2338e7-2de7-4149-bb6a-ae978c7e096a" path="/var/lib/kubelet/pods/7f2338e7-2de7-4149-bb6a-ae978c7e096a/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.117207 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="82acf941-5ce6-4e18-bc6d-1809296622eb" path="/var/lib/kubelet/pods/82acf941-5ce6-4e18-bc6d-1809296622eb/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.118955 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8680ae27-3e72-416b-9983-9b195fedcefb" path="/var/lib/kubelet/pods/8680ae27-3e72-416b-9983-9b195fedcefb/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.119456 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2aa4a5-f152-4361-822c-a114f9b41b49" path="/var/lib/kubelet/pods/8a2aa4a5-f152-4361-822c-a114f9b41b49/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.120118 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" path="/var/lib/kubelet/pods/9452e1b2-42ec-47b6-96e1-2770c9e76db2/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.121223 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9740f406-8da5-496b-a8c7-b0c7474fe4da" path="/var/lib/kubelet/pods/9740f406-8da5-496b-a8c7-b0c7474fe4da/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.121771 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980cd36a-7926-48a8-9749-559317eeee7f" path="/var/lib/kubelet/pods/980cd36a-7926-48a8-9749-559317eeee7f/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.122472 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d" path="/var/lib/kubelet/pods/9ef67dda-a2a4-4ad0-99e3-3b918fdaca0d/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.123566 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7f1d157-db58-4392-911c-344fcc5a8ce1" path="/var/lib/kubelet/pods/a7f1d157-db58-4392-911c-344fcc5a8ce1/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.124124 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="d4d78f46-553d-47a0-a433-445b66500e1c" path="/var/lib/kubelet/pods/d4d78f46-553d-47a0-a433-445b66500e1c/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.124810 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f8aee7-df50-4d02-bdc6-a0feacc6868b" path="/var/lib/kubelet/pods/d7f8aee7-df50-4d02-bdc6-a0feacc6868b/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.125967 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" path="/var/lib/kubelet/pods/d868c627-a661-4c69-afd7-26d88b2be0ec/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.126494 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" path="/var/lib/kubelet/pods/e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.127024 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20b8e31-ab07-411d-844b-f69077acbe95" path="/var/lib/kubelet/pods/f20b8e31-ab07-411d-844b-f69077acbe95/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.128013 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdea835b-f122-4db5-b7c1-ca180d9f3853" path="/var/lib/kubelet/pods/fdea835b-f122-4db5-b7c1-ca180d9f3853/volumes" Dec 16 07:16:24 crc kubenswrapper[4789]: I1216 07:16:24.902632 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.001857 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-config-data\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.002224 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-ceilometer-tls-certs\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.002368 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-run-httpd\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.002398 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-combined-ca-bundle\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.002435 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bwp7\" (UniqueName: \"kubernetes.io/projected/1894718e-3dac-4430-9285-e397fb21e852-kube-api-access-6bwp7\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.002499 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-scripts\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.002518 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-log-httpd\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.002580 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-sg-core-conf-yaml\") pod \"1894718e-3dac-4430-9285-e397fb21e852\" (UID: \"1894718e-3dac-4430-9285-e397fb21e852\") " Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.003303 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.003538 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.003627 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.018801 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-scripts" (OuterVolumeSpecName: "scripts") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.020213 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1894718e-3dac-4430-9285-e397fb21e852-kube-api-access-6bwp7" (OuterVolumeSpecName: "kube-api-access-6bwp7") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). InnerVolumeSpecName "kube-api-access-6bwp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.025346 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.043065 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.043873 4789 generic.go:334] "Generic (PLEG): container finished" podID="1894718e-3dac-4430-9285-e397fb21e852" containerID="02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0" exitCode=0 Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.043895 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerDied","Data":"02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0"} Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.044349 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1894718e-3dac-4430-9285-e397fb21e852","Type":"ContainerDied","Data":"068789b909c49a0acec2052a1433774096d7594593fe4299026c45169c555698"} Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.043968 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.044507 4789 scope.go:117] "RemoveContainer" containerID="231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.058344 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.082378 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-config-data" (OuterVolumeSpecName: "config-data") pod "1894718e-3dac-4430-9285-e397fb21e852" (UID: "1894718e-3dac-4430-9285-e397fb21e852"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.104775 4789 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.104810 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.104819 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bwp7\" (UniqueName: \"kubernetes.io/projected/1894718e-3dac-4430-9285-e397fb21e852-kube-api-access-6bwp7\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.104828 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.104837 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1894718e-3dac-4430-9285-e397fb21e852-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.104847 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.104854 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1894718e-3dac-4430-9285-e397fb21e852-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.114700 4789 scope.go:117] "RemoveContainer" containerID="c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.137785 4789 scope.go:117] "RemoveContainer" containerID="02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.167275 4789 scope.go:117] "RemoveContainer" containerID="9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.186832 4789 scope.go:117] "RemoveContainer" containerID="231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331" Dec 16 07:16:25 crc kubenswrapper[4789]: E1216 07:16:25.187417 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331\": container with ID starting with 231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331 not found: ID does not exist" containerID="231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.187478 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331"} err="failed to get container status \"231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331\": rpc error: code = NotFound desc = could not find container 
\"231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331\": container with ID starting with 231a3bb2df245d9160ce3501b6b1f7b640e92835557ce784e001bba1eed2c331 not found: ID does not exist" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.187512 4789 scope.go:117] "RemoveContainer" containerID="c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6" Dec 16 07:16:25 crc kubenswrapper[4789]: E1216 07:16:25.187836 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6\": container with ID starting with c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6 not found: ID does not exist" containerID="c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.187862 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6"} err="failed to get container status \"c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6\": rpc error: code = NotFound desc = could not find container \"c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6\": container with ID starting with c40c9fc0aaf15a19a20614883578afb6a04f14c1b00a2fdce09ed29ed6f4c8e6 not found: ID does not exist" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.187882 4789 scope.go:117] "RemoveContainer" containerID="02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0" Dec 16 07:16:25 crc kubenswrapper[4789]: E1216 07:16:25.188531 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0\": container with ID starting with 02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0 not found: ID does not exist" 
containerID="02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.188574 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0"} err="failed to get container status \"02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0\": rpc error: code = NotFound desc = could not find container \"02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0\": container with ID starting with 02c58a51cf4ac9991f6c4949e4d87731a9a5a24ce69d1f74393a2f6f14e083b0 not found: ID does not exist" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.188601 4789 scope.go:117] "RemoveContainer" containerID="9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4" Dec 16 07:16:25 crc kubenswrapper[4789]: E1216 07:16:25.188857 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4\": container with ID starting with 9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4 not found: ID does not exist" containerID="9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.188880 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4"} err="failed to get container status \"9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4\": rpc error: code = NotFound desc = could not find container \"9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4\": container with ID starting with 9033a67b2ea46abfd6e71909cda2995de04b3ab05576db0a3c69cba1f5d7cfe4 not found: ID does not exist" Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.383815 4789 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:16:25 crc kubenswrapper[4789]: I1216 07:16:25.389719 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 07:16:26 crc kubenswrapper[4789]: I1216 07:16:26.116084 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1894718e-3dac-4430-9285-e397fb21e852" path="/var/lib/kubelet/pods/1894718e-3dac-4430-9285-e397fb21e852/volumes" Dec 16 07:16:26 crc kubenswrapper[4789]: E1216 07:16:26.320805 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:26 crc kubenswrapper[4789]: E1216 07:16:26.320862 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:26 crc kubenswrapper[4789]: E1216 07:16:26.320892 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts podName:1964cf41-49d7-4b0d-ab8b-fbf9b621e359 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:34.320875606 +0000 UTC m=+1532.582763235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts") pod "barbican30bf-account-delete-mwl8m" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359") : configmap "openstack-scripts" not found Dec 16 07:16:26 crc kubenswrapper[4789]: E1216 07:16:26.320940 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts podName:0491a70b-b044-4ec4-b179-778967cd4573 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:34.320922467 +0000 UTC m=+1532.582810106 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts") pod "novaapifc07-account-delete-bxbmv" (UID: "0491a70b-b044-4ec4-b179-778967cd4573") : configmap "openstack-scripts" not found Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.154520 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.155149 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.155467 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.155504 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container 
process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.156445 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.157771 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.159595 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:27 crc kubenswrapper[4789]: E1216 07:16:27.159651 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:16:29 crc kubenswrapper[4789]: I1216 07:16:29.544500 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5787d477bc-ccrwj" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-httpd" 
probeResult="failure" output="Get \"https://10.217.0.163:9696/\": dial tcp 10.217.0.163:9696: connect: connection refused" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.609206 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5hkcf"] Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.609738 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerName="galera" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.609775 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerName="galera" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.609814 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.609830 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.609851 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.609863 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.609882 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b89e30-7a68-4c02-8386-cc104108a8ea" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.609895 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b89e30-7a68-4c02-8386-cc104108a8ea" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.609952 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerName="rabbitmq" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.609964 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerName="rabbitmq" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.609984 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610001 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610022 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerName="mysql-bootstrap" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610036 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerName="mysql-bootstrap" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610060 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdea835b-f122-4db5-b7c1-ca180d9f3853" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610107 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdea835b-f122-4db5-b7c1-ca180d9f3853" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610134 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="sg-core" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610146 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="sg-core" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610162 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bf1af2cc-24b9-4786-befa-74623fca05f7" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610174 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1af2cc-24b9-4786-befa-74623fca05f7" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610187 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610200 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-api" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610217 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8680ae27-3e72-416b-9983-9b195fedcefb" containerName="nova-scheduler-scheduler" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610230 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8680ae27-3e72-416b-9983-9b195fedcefb" containerName="nova-scheduler-scheduler" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610250 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610263 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610282 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="ovn-northd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610293 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="ovn-northd" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610310 4789 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610322 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610342 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610354 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610373 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerName="dnsmasq-dns" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610385 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerName="dnsmasq-dns" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610403 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610414 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610430 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23503f0-7f00-4d2d-830b-fed7db6e6a08" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610443 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23503f0-7f00-4d2d-830b-fed7db6e6a08" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610468 4789 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-central-agent" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610479 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-central-agent" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610496 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerName="init" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610507 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerName="init" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610531 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610543 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610558 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610571 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-api" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610591 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2338e7-2de7-4149-bb6a-ae978c7e096a" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610603 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2338e7-2de7-4149-bb6a-ae978c7e096a" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610620 4789 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3b1c91-5558-4afb-a9fc-dd75527451ee" containerName="memcached" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610632 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3b1c91-5558-4afb-a9fc-dd75527451ee" containerName="memcached" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610672 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="ovsdbserver-sb" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610684 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="ovsdbserver-sb" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610701 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610716 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610741 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29706741-1258-454c-968f-836e472cb685" containerName="kube-state-metrics" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610758 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="29706741-1258-454c-968f-836e472cb685" containerName="kube-state-metrics" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610785 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerName="setup-container" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610801 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerName="setup-container" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610820 4789 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="254f667d-eae3-486b-b9e8-ffc571d65635" containerName="keystone-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610834 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="254f667d-eae3-486b-b9e8-ffc571d65635" containerName="keystone-api" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610861 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610875 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610895 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610907 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610948 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerName="mysql-bootstrap" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610960 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerName="mysql-bootstrap" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.610982 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.610994 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611014 
4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611030 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611057 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611073 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611093 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611109 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611125 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" containerName="nova-cell1-conductor-conductor" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611137 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" containerName="nova-cell1-conductor-conductor" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611156 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611167 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 
07:16:30.611184 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611196 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611219 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="ovsdbserver-nb" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611231 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="ovsdbserver-nb" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611250 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358d8958-a563-407c-8b7f-75aee19a3699" containerName="nova-cell0-conductor-conductor" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611262 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="358d8958-a563-407c-8b7f-75aee19a3699" containerName="nova-cell0-conductor-conductor" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611275 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611287 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611301 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611313 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-log" Dec 16 07:16:30 crc 
kubenswrapper[4789]: E1216 07:16:30.611329 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="cinder-scheduler" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611341 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="cinder-scheduler" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611364 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerName="setup-container" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611376 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerName="setup-container" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611395 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-metadata" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611407 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-metadata" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611426 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="probe" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611437 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="probe" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611451 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82acf941-5ce6-4e18-bc6d-1809296622eb" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611464 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="82acf941-5ce6-4e18-bc6d-1809296622eb" containerName="mariadb-account-delete" Dec 16 07:16:30 crc 
kubenswrapper[4789]: E1216 07:16:30.611487 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-notification-agent" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611499 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-notification-agent" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611516 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611528 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611547 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00293d36-0c18-4d79-aacd-4224045ff895" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611559 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="00293d36-0c18-4d79-aacd-4224045ff895" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611581 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerName="rabbitmq" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611593 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerName="rabbitmq" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611612 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerName="galera" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611621 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerName="galera" Dec 16 
07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611639 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-server" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611647 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-server" Dec 16 07:16:30 crc kubenswrapper[4789]: E1216 07:16:30.611658 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="proxy-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611666 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="proxy-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611870 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611885 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611898 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="sg-core" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611909 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f699c71b-1e44-4a4d-b1fb-77ef105af03d" containerName="galera" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611942 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611958 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="openstack-network-exporter" Dec 
16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611971 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="proxy-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611980 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2338e7-2de7-4149-bb6a-ae978c7e096a" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.611990 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3b1c91-5558-4afb-a9fc-dd75527451ee" containerName="memcached" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612006 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612021 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612033 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612042 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-notification-agent" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612051 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612065 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="cinder-scheduler" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612075 4789 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612090 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6eb716a-c1c8-47f7-bef8-68fbdf9162c0" containerName="barbican-keystone-listener" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612105 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="31336d9f-38cf-4805-927b-3ae986f6c88e" containerName="rabbitmq" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612116 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e16a3ef-920e-493a-ae2f-7336d64bbd7e" containerName="ovn-controller" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612130 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fceb99a-9dfd-4d79-a0fd-666390de4440" containerName="ovsdbserver-sb" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612143 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ebe6e6-04ff-4a7b-96cc-b4cd84aef9fe" containerName="nova-cell1-conductor-conductor" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612161 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d868c627-a661-4c69-afd7-26d88b2be0ec" containerName="galera" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612205 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="82acf941-5ce6-4e18-bc6d-1809296622eb" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612217 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9452e1b2-42ec-47b6-96e1-2770c9e76db2" containerName="rabbitmq" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612228 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="de637363-990a-4590-b9c5-ab66c18ec270" containerName="probe" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612241 4789 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5031d0ac-42ac-4346-9403-0369a555ab4a" containerName="nova-metadata-metadata" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612254 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612269 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="37216df1-3f61-412b-bffb-5e36812383f4" containerName="glance-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612280 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc02bf7e-2d67-40a4-94b0-5807631a5b2e" containerName="proxy-server" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612290 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612301 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="ovsdbserver-nb" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612310 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8368d044-b088-48f9-b5cb-19a95b997576" containerName="placement-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612324 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612338 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1894718e-3dac-4430-9285-e397fb21e852" containerName="ceilometer-central-agent" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612348 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd018ae-71a5-4dbe-bdd6-ba7fbad1a4f5" containerName="cinder-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 
07:16:30.612360 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2a6bf4-542c-4b1c-b461-6b6f427a1d84" containerName="nova-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612371 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b89e30-7a68-4c02-8386-cc104108a8ea" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612380 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f88379-6b15-47a6-bf24-7cf0b3edc56a" containerName="ovn-northd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612392 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23503f0-7f00-4d2d-830b-fed7db6e6a08" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612400 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1af2cc-24b9-4786-befa-74623fca05f7" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612416 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdea835b-f122-4db5-b7c1-ca180d9f3853" containerName="mariadb-account-delete" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612428 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8680ae27-3e72-416b-9983-9b195fedcefb" containerName="nova-scheduler-scheduler" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612439 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="29706741-1258-454c-968f-836e472cb685" containerName="kube-state-metrics" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612449 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b0a572-437e-4d15-a74d-e92c0f39c9cc" containerName="barbican-api-log" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612459 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="358d8958-a563-407c-8b7f-75aee19a3699" 
containerName="nova-cell0-conductor-conductor" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612467 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6423ab7-79a3-402c-9115-e54b5f29ad05" containerName="glance-httpd" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612475 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff4de9f-c7d4-4d77-81be-7a499ead0f10" containerName="dnsmasq-dns" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612485 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00adc24-beed-43df-95a8-274b841d60a0" containerName="barbican-worker" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612498 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="254f667d-eae3-486b-b9e8-ffc571d65635" containerName="keystone-api" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612512 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6456012f-c7be-458c-a9a5-b3958ae72c2c" containerName="openstack-network-exporter" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.612523 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="00293d36-0c18-4d79-aacd-4224045ff895" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.613951 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.617975 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hkcf"] Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.791311 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpd27\" (UniqueName: \"kubernetes.io/projected/fff29c31-7432-4682-a513-a1a6dcd9b276-kube-api-access-gpd27\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.791412 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-utilities\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.791491 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-catalog-content\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.893206 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-utilities\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.893346 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-catalog-content\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.893404 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpd27\" (UniqueName: \"kubernetes.io/projected/fff29c31-7432-4682-a513-a1a6dcd9b276-kube-api-access-gpd27\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.894411 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-utilities\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.894717 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-catalog-content\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.928836 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpd27\" (UniqueName: \"kubernetes.io/projected/fff29c31-7432-4682-a513-a1a6dcd9b276-kube-api-access-gpd27\") pod \"community-operators-5hkcf\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:30 crc kubenswrapper[4789]: I1216 07:16:30.933310 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:31 crc kubenswrapper[4789]: I1216 07:16:31.511857 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5hkcf"] Dec 16 07:16:32 crc kubenswrapper[4789]: I1216 07:16:32.100066 4789 generic.go:334] "Generic (PLEG): container finished" podID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerID="3b5e42980c1164a70a51ff9e5fe051b81b82ef64a3ddd9ccb748676bfbca4c03" exitCode=0 Dec 16 07:16:32 crc kubenswrapper[4789]: I1216 07:16:32.100140 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hkcf" event={"ID":"fff29c31-7432-4682-a513-a1a6dcd9b276","Type":"ContainerDied","Data":"3b5e42980c1164a70a51ff9e5fe051b81b82ef64a3ddd9ccb748676bfbca4c03"} Dec 16 07:16:32 crc kubenswrapper[4789]: I1216 07:16:32.100360 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hkcf" event={"ID":"fff29c31-7432-4682-a513-a1a6dcd9b276","Type":"ContainerStarted","Data":"cfa8db1ac0f45b46de6b58349311b7db179fc3e84619329f5000c95de2f2525a"} Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.154524 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.155178 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" 
containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.155542 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.155584 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.156867 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.158850 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.161022 4789 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:32 crc kubenswrapper[4789]: E1216 07:16:32.161080 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:16:33 crc kubenswrapper[4789]: I1216 07:16:33.111185 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hkcf" event={"ID":"fff29c31-7432-4682-a513-a1a6dcd9b276","Type":"ContainerStarted","Data":"68cba6fd7ad3cdf11bbda4285ead1a49e2b97b2c058b2653e91d08079c8ddc92"} Dec 16 07:16:34 crc kubenswrapper[4789]: I1216 07:16:34.122264 4789 generic.go:334] "Generic (PLEG): container finished" podID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerID="68cba6fd7ad3cdf11bbda4285ead1a49e2b97b2c058b2653e91d08079c8ddc92" exitCode=0 Dec 16 07:16:34 crc kubenswrapper[4789]: I1216 07:16:34.122325 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hkcf" event={"ID":"fff29c31-7432-4682-a513-a1a6dcd9b276","Type":"ContainerDied","Data":"68cba6fd7ad3cdf11bbda4285ead1a49e2b97b2c058b2653e91d08079c8ddc92"} Dec 16 07:16:34 crc kubenswrapper[4789]: E1216 07:16:34.348659 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:34 crc kubenswrapper[4789]: E1216 07:16:34.348678 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:34 crc kubenswrapper[4789]: E1216 07:16:34.348728 4789 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts podName:0491a70b-b044-4ec4-b179-778967cd4573 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:50.348708734 +0000 UTC m=+1548.610596363 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts") pod "novaapifc07-account-delete-bxbmv" (UID: "0491a70b-b044-4ec4-b179-778967cd4573") : configmap "openstack-scripts" not found Dec 16 07:16:34 crc kubenswrapper[4789]: E1216 07:16:34.348746 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts podName:1964cf41-49d7-4b0d-ab8b-fbf9b621e359 nodeName:}" failed. No retries permitted until 2025-12-16 07:16:50.348738474 +0000 UTC m=+1548.610626103 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts") pod "barbican30bf-account-delete-mwl8m" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359") : configmap "openstack-scripts" not found Dec 16 07:16:35 crc kubenswrapper[4789]: I1216 07:16:35.134397 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hkcf" event={"ID":"fff29c31-7432-4682-a513-a1a6dcd9b276","Type":"ContainerStarted","Data":"082b5cbf3962a67a6c3dedb0225decfb98e953c814b247a7aeb37ccf3c596418"} Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.164562 4789 generic.go:334] "Generic (PLEG): container finished" podID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerID="1811fc6d133a6d47f93c7b8e7704ffe66b0cb1ade5e47088042f32756e1e0944" exitCode=0 Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.164673 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-5787d477bc-ccrwj" event={"ID":"73660d16-d925-4e43-8df7-2c40959bb7ed","Type":"ContainerDied","Data":"1811fc6d133a6d47f93c7b8e7704ffe66b0cb1ade5e47088042f32756e1e0944"} Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.275519 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.297027 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5hkcf" podStartSLOduration=3.77999001 podStartE2EDuration="6.297010096s" podCreationTimestamp="2025-12-16 07:16:30 +0000 UTC" firstStartedPulling="2025-12-16 07:16:32.10314126 +0000 UTC m=+1530.365028889" lastFinishedPulling="2025-12-16 07:16:34.620161346 +0000 UTC m=+1532.882048975" observedRunningTime="2025-12-16 07:16:35.159843484 +0000 UTC m=+1533.421731113" watchObservedRunningTime="2025-12-16 07:16:36.297010096 +0000 UTC m=+1534.558897725" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.387419 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-internal-tls-certs\") pod \"73660d16-d925-4e43-8df7-2c40959bb7ed\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.387478 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnhs2\" (UniqueName: \"kubernetes.io/projected/73660d16-d925-4e43-8df7-2c40959bb7ed-kube-api-access-hnhs2\") pod \"73660d16-d925-4e43-8df7-2c40959bb7ed\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.387509 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-config\") pod 
\"73660d16-d925-4e43-8df7-2c40959bb7ed\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.387561 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-combined-ca-bundle\") pod \"73660d16-d925-4e43-8df7-2c40959bb7ed\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.387587 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-ovndb-tls-certs\") pod \"73660d16-d925-4e43-8df7-2c40959bb7ed\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.387606 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-public-tls-certs\") pod \"73660d16-d925-4e43-8df7-2c40959bb7ed\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.387682 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-httpd-config\") pod \"73660d16-d925-4e43-8df7-2c40959bb7ed\" (UID: \"73660d16-d925-4e43-8df7-2c40959bb7ed\") " Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.395102 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73660d16-d925-4e43-8df7-2c40959bb7ed-kube-api-access-hnhs2" (OuterVolumeSpecName: "kube-api-access-hnhs2") pod "73660d16-d925-4e43-8df7-2c40959bb7ed" (UID: "73660d16-d925-4e43-8df7-2c40959bb7ed"). InnerVolumeSpecName "kube-api-access-hnhs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.395094 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "73660d16-d925-4e43-8df7-2c40959bb7ed" (UID: "73660d16-d925-4e43-8df7-2c40959bb7ed"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.431076 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-config" (OuterVolumeSpecName: "config") pod "73660d16-d925-4e43-8df7-2c40959bb7ed" (UID: "73660d16-d925-4e43-8df7-2c40959bb7ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.433192 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "73660d16-d925-4e43-8df7-2c40959bb7ed" (UID: "73660d16-d925-4e43-8df7-2c40959bb7ed"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.445253 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73660d16-d925-4e43-8df7-2c40959bb7ed" (UID: "73660d16-d925-4e43-8df7-2c40959bb7ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.451015 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "73660d16-d925-4e43-8df7-2c40959bb7ed" (UID: "73660d16-d925-4e43-8df7-2c40959bb7ed"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.473397 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "73660d16-d925-4e43-8df7-2c40959bb7ed" (UID: "73660d16-d925-4e43-8df7-2c40959bb7ed"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.490142 4789 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.490191 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnhs2\" (UniqueName: \"kubernetes.io/projected/73660d16-d925-4e43-8df7-2c40959bb7ed-kube-api-access-hnhs2\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.490207 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.490218 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.490230 4789 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.490240 4789 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:36 crc kubenswrapper[4789]: I1216 07:16:36.490252 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73660d16-d925-4e43-8df7-2c40959bb7ed-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.156015 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.156719 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.160219 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.160294 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.168526 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.180029 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.192174 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:37 crc kubenswrapper[4789]: E1216 07:16:37.192230 4789 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:16:37 crc kubenswrapper[4789]: I1216 07:16:37.196343 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5787d477bc-ccrwj" event={"ID":"73660d16-d925-4e43-8df7-2c40959bb7ed","Type":"ContainerDied","Data":"7b507608a565ab382c678b5753862f64a6fdcc81cac0a58dd680a8bde4b844de"} Dec 16 07:16:37 crc kubenswrapper[4789]: I1216 07:16:37.196401 4789 scope.go:117] "RemoveContainer" containerID="d61236e0a1b169ed76d6b190800ead5dd0b19f9acc8d953c9f3b75b5c79591fd" Dec 16 07:16:37 crc kubenswrapper[4789]: I1216 07:16:37.196561 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5787d477bc-ccrwj" Dec 16 07:16:37 crc kubenswrapper[4789]: I1216 07:16:37.262567 4789 scope.go:117] "RemoveContainer" containerID="1811fc6d133a6d47f93c7b8e7704ffe66b0cb1ade5e47088042f32756e1e0944" Dec 16 07:16:37 crc kubenswrapper[4789]: I1216 07:16:37.300525 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5787d477bc-ccrwj"] Dec 16 07:16:37 crc kubenswrapper[4789]: I1216 07:16:37.309792 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5787d477bc-ccrwj"] Dec 16 07:16:38 crc kubenswrapper[4789]: I1216 07:16:38.113875 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" path="/var/lib/kubelet/pods/73660d16-d925-4e43-8df7-2c40959bb7ed/volumes" Dec 16 07:16:40 crc kubenswrapper[4789]: I1216 07:16:40.935141 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:40 crc kubenswrapper[4789]: I1216 07:16:40.935361 4789 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:40 crc kubenswrapper[4789]: I1216 07:16:40.978231 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:41 crc kubenswrapper[4789]: I1216 07:16:41.268018 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:41 crc kubenswrapper[4789]: I1216 07:16:41.321516 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hkcf"] Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.153809 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.154548 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.155135 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.155243 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.155266 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.156730 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.157931 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 16 07:16:42 crc kubenswrapper[4789]: E1216 07:16:42.157990 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tblns" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:16:43 crc kubenswrapper[4789]: I1216 07:16:43.240707 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5hkcf" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="registry-server" containerID="cri-o://082b5cbf3962a67a6c3dedb0225decfb98e953c814b247a7aeb37ccf3c596418" gracePeriod=2 Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.272429 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerID="abff080aef14c07b0b737efd0a65faff826c48715b5f1c2ab9b91640d17f6623" exitCode=137 Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.272856 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"abff080aef14c07b0b737efd0a65faff826c48715b5f1c2ab9b91640d17f6623"} Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.280183 4789 generic.go:334] "Generic (PLEG): container finished" podID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerID="082b5cbf3962a67a6c3dedb0225decfb98e953c814b247a7aeb37ccf3c596418" exitCode=0 Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.280248 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hkcf" event={"ID":"fff29c31-7432-4682-a513-a1a6dcd9b276","Type":"ContainerDied","Data":"082b5cbf3962a67a6c3dedb0225decfb98e953c814b247a7aeb37ccf3c596418"} Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.286093 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tblns_b5429404-d973-4580-961a-8ad6081e93ec/ovs-vswitchd/0.log" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.286830 4789 generic.go:334] "Generic (PLEG): 
container finished" podID="b5429404-d973-4580-961a-8ad6081e93ec" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" exitCode=137 Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.286873 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tblns" event={"ID":"b5429404-d973-4580-961a-8ad6081e93ec","Type":"ContainerDied","Data":"4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee"} Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.579346 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.625050 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpd27\" (UniqueName: \"kubernetes.io/projected/fff29c31-7432-4682-a513-a1a6dcd9b276-kube-api-access-gpd27\") pod \"fff29c31-7432-4682-a513-a1a6dcd9b276\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.625131 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-catalog-content\") pod \"fff29c31-7432-4682-a513-a1a6dcd9b276\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.625227 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-utilities\") pod \"fff29c31-7432-4682-a513-a1a6dcd9b276\" (UID: \"fff29c31-7432-4682-a513-a1a6dcd9b276\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.626624 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-utilities" (OuterVolumeSpecName: "utilities") pod 
"fff29c31-7432-4682-a513-a1a6dcd9b276" (UID: "fff29c31-7432-4682-a513-a1a6dcd9b276"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.632876 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff29c31-7432-4682-a513-a1a6dcd9b276-kube-api-access-gpd27" (OuterVolumeSpecName: "kube-api-access-gpd27") pod "fff29c31-7432-4682-a513-a1a6dcd9b276" (UID: "fff29c31-7432-4682-a513-a1a6dcd9b276"). InnerVolumeSpecName "kube-api-access-gpd27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.687823 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.702263 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fff29c31-7432-4682-a513-a1a6dcd9b276" (UID: "fff29c31-7432-4682-a513-a1a6dcd9b276"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.736055 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-cache\") pod \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.736100 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4pkq\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-kube-api-access-h4pkq\") pod \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.736134 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.736150 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") pod \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.736710 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-lock\") pod \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\" (UID: \"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.737324 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-cache" (OuterVolumeSpecName: "cache") pod 
"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.738465 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpd27\" (UniqueName: \"kubernetes.io/projected/fff29c31-7432-4682-a513-a1a6dcd9b276-kube-api-access-gpd27\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.738508 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.738523 4789 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-cache\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.738536 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff29c31-7432-4682-a513-a1a6dcd9b276-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.739788 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-lock" (OuterVolumeSpecName: "lock") pod "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.743071 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-kube-api-access-h4pkq" (OuterVolumeSpecName: "kube-api-access-h4pkq") pod "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0"). InnerVolumeSpecName "kube-api-access-h4pkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.743120 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.743134 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" (UID: "cbd6bd33-5f98-4eb6-9fee-5080941ee4c0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.787632 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tblns_b5429404-d973-4580-961a-8ad6081e93ec/ovs-vswitchd/0.log" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.788669 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840009 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5429404-d973-4580-961a-8ad6081e93ec-scripts\") pod \"b5429404-d973-4580-961a-8ad6081e93ec\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840097 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-run\") pod \"b5429404-d973-4580-961a-8ad6081e93ec\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840119 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-lib\") pod \"b5429404-d973-4580-961a-8ad6081e93ec\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840147 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-log\") pod \"b5429404-d973-4580-961a-8ad6081e93ec\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840177 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-etc-ovs\") pod \"b5429404-d973-4580-961a-8ad6081e93ec\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840216 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bfrk\" (UniqueName: 
\"kubernetes.io/projected/b5429404-d973-4580-961a-8ad6081e93ec-kube-api-access-5bfrk\") pod \"b5429404-d973-4580-961a-8ad6081e93ec\" (UID: \"b5429404-d973-4580-961a-8ad6081e93ec\") " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840204 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-run" (OuterVolumeSpecName: "var-run") pod "b5429404-d973-4580-961a-8ad6081e93ec" (UID: "b5429404-d973-4580-961a-8ad6081e93ec"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840282 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-log" (OuterVolumeSpecName: "var-log") pod "b5429404-d973-4580-961a-8ad6081e93ec" (UID: "b5429404-d973-4580-961a-8ad6081e93ec"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840310 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-lib" (OuterVolumeSpecName: "var-lib") pod "b5429404-d973-4580-961a-8ad6081e93ec" (UID: "b5429404-d973-4580-961a-8ad6081e93ec"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840333 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "b5429404-d973-4580-961a-8ad6081e93ec" (UID: "b5429404-d973-4580-961a-8ad6081e93ec"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840577 4789 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-lock\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840596 4789 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840607 4789 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-lib\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840618 4789 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-var-log\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840628 4789 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b5429404-d973-4580-961a-8ad6081e93ec-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840640 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4pkq\" (UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-kube-api-access-h4pkq\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840662 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.840673 4789 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.841109 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5429404-d973-4580-961a-8ad6081e93ec-scripts" (OuterVolumeSpecName: "scripts") pod "b5429404-d973-4580-961a-8ad6081e93ec" (UID: "b5429404-d973-4580-961a-8ad6081e93ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.844040 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5429404-d973-4580-961a-8ad6081e93ec-kube-api-access-5bfrk" (OuterVolumeSpecName: "kube-api-access-5bfrk") pod "b5429404-d973-4580-961a-8ad6081e93ec" (UID: "b5429404-d973-4580-961a-8ad6081e93ec"). InnerVolumeSpecName "kube-api-access-5bfrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.854046 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.942529 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.942574 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5429404-d973-4580-961a-8ad6081e93ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:45 crc kubenswrapper[4789]: I1216 07:16:45.942622 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bfrk\" (UniqueName: \"kubernetes.io/projected/b5429404-d973-4580-961a-8ad6081e93ec-kube-api-access-5bfrk\") on node \"crc\" 
DevicePath \"\"" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.303676 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cbd6bd33-5f98-4eb6-9fee-5080941ee4c0","Type":"ContainerDied","Data":"aa014e0d284e6ea46e4838e5e30274d9085c41e7e66e8dced92e2bba1d40352d"} Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.303724 4789 scope.go:117] "RemoveContainer" containerID="abff080aef14c07b0b737efd0a65faff826c48715b5f1c2ab9b91640d17f6623" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.304035 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.306389 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5hkcf" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.306467 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5hkcf" event={"ID":"fff29c31-7432-4682-a513-a1a6dcd9b276","Type":"ContainerDied","Data":"cfa8db1ac0f45b46de6b58349311b7db179fc3e84619329f5000c95de2f2525a"} Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.309120 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tblns_b5429404-d973-4580-961a-8ad6081e93ec/ovs-vswitchd/0.log" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.312054 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tblns" event={"ID":"b5429404-d973-4580-961a-8ad6081e93ec","Type":"ContainerDied","Data":"1e9ca768581b07a47cbe4eb52da147b3dc82157f500ff347c14e7e5647f13dd3"} Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.312114 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tblns" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.330964 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5hkcf"] Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.333039 4789 scope.go:117] "RemoveContainer" containerID="d8238af7dbf15f23415f0c86259fcf9957fbc0b08bcb581d4f0624333c152ec1" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.337591 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5hkcf"] Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.348517 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.354609 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.355689 4789 scope.go:117] "RemoveContainer" containerID="58f8b4cb7ddbfc39c3c2c236d8c52319b46445fa6bd8e36d14a249780702ad85" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.368398 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tblns"] Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.384202 4789 scope.go:117] "RemoveContainer" containerID="7f58e5c14558f31f6600906b48eb2e6f74d0e6249665f123eef015ba515b9e8b" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.386653 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tblns"] Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.401221 4789 scope.go:117] "RemoveContainer" containerID="bf3fe2408d858c60b990dfb63b6c210d31747a7a36a94cb83c07d547d090370f" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.422697 4789 scope.go:117] "RemoveContainer" containerID="dfd251c4b8cc4551da74250c7e1018cc05d1c34c1749b00d7314e5704a70d11c" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.438098 4789 
scope.go:117] "RemoveContainer" containerID="c05e3cfb0b0446d45c6b1efc03786be1905a9914fbbf8eca279bc89ee3642716" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.455894 4789 scope.go:117] "RemoveContainer" containerID="edcd02c79a5409469199dea08015de9c6eeffbea5566bd3cd4db97a260e47fdd" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.472554 4789 scope.go:117] "RemoveContainer" containerID="ec371978a44bd2c62cd3ea38c393bf36090b055edd6151b95aa9b353fbdb7387" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.490921 4789 scope.go:117] "RemoveContainer" containerID="e0c8a6f56c8022db43b02bf2bd015331c0cdd2235c3eca42b9e1e1f7f8bd3705" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.506419 4789 scope.go:117] "RemoveContainer" containerID="7428e5236584f2fe103930cb1f61dd303456f8c0deb11b5bbb601d51deecfb66" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.560791 4789 scope.go:117] "RemoveContainer" containerID="20593004d226e1585979c62630548d692855df2932aab4c7c86476377d9cc2cc" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.574824 4789 scope.go:117] "RemoveContainer" containerID="b93f37726e0744613bc7b449e38506e91bd311f3c6efe8bbf38923fdf51b2146" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.595532 4789 scope.go:117] "RemoveContainer" containerID="391f051ceefce6af95f3e5e5fc2ba9a787ede01ec802f107f998941a77f4283e" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.613721 4789 scope.go:117] "RemoveContainer" containerID="7dd74cf2b547abd9c20fc6d29daa7d954817be3444474dc3629c37701cc99230" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.633395 4789 scope.go:117] "RemoveContainer" containerID="082b5cbf3962a67a6c3dedb0225decfb98e953c814b247a7aeb37ccf3c596418" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.649350 4789 scope.go:117] "RemoveContainer" containerID="68cba6fd7ad3cdf11bbda4285ead1a49e2b97b2c058b2653e91d08079c8ddc92" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.665794 4789 scope.go:117] 
"RemoveContainer" containerID="3b5e42980c1164a70a51ff9e5fe051b81b82ef64a3ddd9ccb748676bfbca4c03" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.691372 4789 scope.go:117] "RemoveContainer" containerID="4feaaf62ab9531f09183f14ea642a9161caffa80f42b542598d0d6f086e490ee" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.715676 4789 scope.go:117] "RemoveContainer" containerID="b64d427c2778ccdcefce635254baa55bf4ac6198ba5e0351e2b9a7f1ef235652" Dec 16 07:16:46 crc kubenswrapper[4789]: I1216 07:16:46.732277 4789 scope.go:117] "RemoveContainer" containerID="c46ade5ea8d28fc7bb2d2be5d8b1d0eb0b2a1ed94708b13e6fcd4a9538e9d515" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.650967 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.739008 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.769760 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data-custom\") pod \"c5bd2649-9508-49bb-833e-7239b7d11d78\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.769817 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2mp2\" (UniqueName: \"kubernetes.io/projected/24f668c2-651f-48f2-8feb-7faa470c3a19-kube-api-access-w2mp2\") pod \"24f668c2-651f-48f2-8feb-7faa470c3a19\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.769840 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f668c2-651f-48f2-8feb-7faa470c3a19-logs\") 
pod \"24f668c2-651f-48f2-8feb-7faa470c3a19\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.769864 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data\") pod \"c5bd2649-9508-49bb-833e-7239b7d11d78\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.769945 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5bd2649-9508-49bb-833e-7239b7d11d78-logs\") pod \"c5bd2649-9508-49bb-833e-7239b7d11d78\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.769963 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data-custom\") pod \"24f668c2-651f-48f2-8feb-7faa470c3a19\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.769998 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data\") pod \"24f668c2-651f-48f2-8feb-7faa470c3a19\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.770024 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqflz\" (UniqueName: \"kubernetes.io/projected/c5bd2649-9508-49bb-833e-7239b7d11d78-kube-api-access-nqflz\") pod \"c5bd2649-9508-49bb-833e-7239b7d11d78\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.770124 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-combined-ca-bundle\") pod \"c5bd2649-9508-49bb-833e-7239b7d11d78\" (UID: \"c5bd2649-9508-49bb-833e-7239b7d11d78\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.770157 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-combined-ca-bundle\") pod \"24f668c2-651f-48f2-8feb-7faa470c3a19\" (UID: \"24f668c2-651f-48f2-8feb-7faa470c3a19\") " Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.770541 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24f668c2-651f-48f2-8feb-7faa470c3a19-logs" (OuterVolumeSpecName: "logs") pod "24f668c2-651f-48f2-8feb-7faa470c3a19" (UID: "24f668c2-651f-48f2-8feb-7faa470c3a19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.770553 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5bd2649-9508-49bb-833e-7239b7d11d78-logs" (OuterVolumeSpecName: "logs") pod "c5bd2649-9508-49bb-833e-7239b7d11d78" (UID: "c5bd2649-9508-49bb-833e-7239b7d11d78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.775331 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "24f668c2-651f-48f2-8feb-7faa470c3a19" (UID: "24f668c2-651f-48f2-8feb-7faa470c3a19"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.775370 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c5bd2649-9508-49bb-833e-7239b7d11d78" (UID: "c5bd2649-9508-49bb-833e-7239b7d11d78"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.777299 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f668c2-651f-48f2-8feb-7faa470c3a19-kube-api-access-w2mp2" (OuterVolumeSpecName: "kube-api-access-w2mp2") pod "24f668c2-651f-48f2-8feb-7faa470c3a19" (UID: "24f668c2-651f-48f2-8feb-7faa470c3a19"). InnerVolumeSpecName "kube-api-access-w2mp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.778082 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5bd2649-9508-49bb-833e-7239b7d11d78-kube-api-access-nqflz" (OuterVolumeSpecName: "kube-api-access-nqflz") pod "c5bd2649-9508-49bb-833e-7239b7d11d78" (UID: "c5bd2649-9508-49bb-833e-7239b7d11d78"). InnerVolumeSpecName "kube-api-access-nqflz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.790087 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24f668c2-651f-48f2-8feb-7faa470c3a19" (UID: "24f668c2-651f-48f2-8feb-7faa470c3a19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.795845 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5bd2649-9508-49bb-833e-7239b7d11d78" (UID: "c5bd2649-9508-49bb-833e-7239b7d11d78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.807502 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data" (OuterVolumeSpecName: "config-data") pod "c5bd2649-9508-49bb-833e-7239b7d11d78" (UID: "c5bd2649-9508-49bb-833e-7239b7d11d78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.809216 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data" (OuterVolumeSpecName: "config-data") pod "24f668c2-651f-48f2-8feb-7faa470c3a19" (UID: "24f668c2-651f-48f2-8feb-7faa470c3a19"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871324 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5bd2649-9508-49bb-833e-7239b7d11d78-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871350 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871362 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871371 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqflz\" (UniqueName: \"kubernetes.io/projected/c5bd2649-9508-49bb-833e-7239b7d11d78-kube-api-access-nqflz\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871380 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871388 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f668c2-651f-48f2-8feb-7faa470c3a19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871396 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871404 
4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2mp2\" (UniqueName: \"kubernetes.io/projected/24f668c2-651f-48f2-8feb-7faa470c3a19-kube-api-access-w2mp2\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871411 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24f668c2-651f-48f2-8feb-7faa470c3a19-logs\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:47 crc kubenswrapper[4789]: I1216 07:16:47.871419 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5bd2649-9508-49bb-833e-7239b7d11d78-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.116564 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5429404-d973-4580-961a-8ad6081e93ec" path="/var/lib/kubelet/pods/b5429404-d973-4580-961a-8ad6081e93ec/volumes" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.118083 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" path="/var/lib/kubelet/pods/cbd6bd33-5f98-4eb6-9fee-5080941ee4c0/volumes" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.120813 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" path="/var/lib/kubelet/pods/fff29c31-7432-4682-a513-a1a6dcd9b276/volumes" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.332271 4789 generic.go:334] "Generic (PLEG): container finished" podID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerID="cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d" exitCode=137 Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.332328 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccb68857-5qpdn" 
event={"ID":"c5bd2649-9508-49bb-833e-7239b7d11d78","Type":"ContainerDied","Data":"cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d"} Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.332355 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccb68857-5qpdn" event={"ID":"c5bd2649-9508-49bb-833e-7239b7d11d78","Type":"ContainerDied","Data":"02f3a045768469bcb82812149404958f11cc790ef72b1c282478eb934a9bafb0"} Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.332371 4789 scope.go:117] "RemoveContainer" containerID="cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.332330 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ccb68857-5qpdn" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.336419 4789 generic.go:334] "Generic (PLEG): container finished" podID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerID="b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7" exitCode=137 Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.336451 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" event={"ID":"24f668c2-651f-48f2-8feb-7faa470c3a19","Type":"ContainerDied","Data":"b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7"} Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.336472 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" event={"ID":"24f668c2-651f-48f2-8feb-7faa470c3a19","Type":"ContainerDied","Data":"89604f15dd7e98d8241fb24132a1f1a8b21deee7462d248bc8212e3dbb60ee2c"} Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.336519 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5978f7f754-pzhh6" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.366841 4789 scope.go:117] "RemoveContainer" containerID="f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.368488 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6ccb68857-5qpdn"] Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.375509 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6ccb68857-5qpdn"] Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.380982 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5978f7f754-pzhh6"] Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.382312 4789 scope.go:117] "RemoveContainer" containerID="cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d" Dec 16 07:16:48 crc kubenswrapper[4789]: E1216 07:16:48.382751 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d\": container with ID starting with cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d not found: ID does not exist" containerID="cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.382787 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d"} err="failed to get container status \"cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d\": rpc error: code = NotFound desc = could not find container \"cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d\": container with ID starting with cdd4dda9bebcf90c6145610a976e32bb32f2a294a12d09ca66e02e46abf9c44d not found: ID 
does not exist" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.382811 4789 scope.go:117] "RemoveContainer" containerID="f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1" Dec 16 07:16:48 crc kubenswrapper[4789]: E1216 07:16:48.383285 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1\": container with ID starting with f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1 not found: ID does not exist" containerID="f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.383315 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1"} err="failed to get container status \"f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1\": rpc error: code = NotFound desc = could not find container \"f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1\": container with ID starting with f6548787f3de2d60eb182246f7587e416ab72b9577b39482072e20e291299cf1 not found: ID does not exist" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.383337 4789 scope.go:117] "RemoveContainer" containerID="b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.388076 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5978f7f754-pzhh6"] Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.401094 4789 scope.go:117] "RemoveContainer" containerID="50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.416812 4789 scope.go:117] "RemoveContainer" containerID="b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7" Dec 16 07:16:48 crc 
kubenswrapper[4789]: E1216 07:16:48.417182 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7\": container with ID starting with b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7 not found: ID does not exist" containerID="b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.417293 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7"} err="failed to get container status \"b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7\": rpc error: code = NotFound desc = could not find container \"b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7\": container with ID starting with b5dd8dd8dd5b93eddc5b39aa94f0b38091772129ea69c2995ace575bcd1664d7 not found: ID does not exist" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.417377 4789 scope.go:117] "RemoveContainer" containerID="50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a" Dec 16 07:16:48 crc kubenswrapper[4789]: E1216 07:16:48.417833 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a\": container with ID starting with 50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a not found: ID does not exist" containerID="50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a" Dec 16 07:16:48 crc kubenswrapper[4789]: I1216 07:16:48.417868 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a"} err="failed to get container status 
\"50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a\": rpc error: code = NotFound desc = could not find container \"50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a\": container with ID starting with 50c4f312f41ee8c3ca3c30d1a889d4f965b524305d11e2030b0578fb377ffd0a not found: ID does not exist" Dec 16 07:16:50 crc kubenswrapper[4789]: I1216 07:16:50.127720 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" path="/var/lib/kubelet/pods/24f668c2-651f-48f2-8feb-7faa470c3a19/volumes" Dec 16 07:16:50 crc kubenswrapper[4789]: I1216 07:16:50.128621 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" path="/var/lib/kubelet/pods/c5bd2649-9508-49bb-833e-7239b7d11d78/volumes" Dec 16 07:16:50 crc kubenswrapper[4789]: E1216 07:16:50.414350 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:50 crc kubenswrapper[4789]: E1216 07:16:50.414407 4789 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 16 07:16:50 crc kubenswrapper[4789]: E1216 07:16:50.414774 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts podName:0491a70b-b044-4ec4-b179-778967cd4573 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:22.414756562 +0000 UTC m=+1580.676644191 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts") pod "novaapifc07-account-delete-bxbmv" (UID: "0491a70b-b044-4ec4-b179-778967cd4573") : configmap "openstack-scripts" not found Dec 16 07:16:50 crc kubenswrapper[4789]: E1216 07:16:50.414863 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts podName:1964cf41-49d7-4b0d-ab8b-fbf9b621e359 nodeName:}" failed. No retries permitted until 2025-12-16 07:17:22.414841364 +0000 UTC m=+1580.676729003 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts") pod "barbican30bf-account-delete-mwl8m" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359") : configmap "openstack-scripts" not found Dec 16 07:16:52 crc kubenswrapper[4789]: I1216 07:16:52.259260 4789 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod9452e1b2-42ec-47b6-96e1-2770c9e76db2"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod9452e1b2-42ec-47b6-96e1-2770c9e76db2] : Timed out while waiting for systemd to remove kubepods-burstable-pod9452e1b2_42ec_47b6_96e1_2770c9e76db2.slice" Dec 16 07:16:52 crc kubenswrapper[4789]: I1216 07:16:52.912290 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:52 crc kubenswrapper[4789]: I1216 07:16:52.950362 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jn5h\" (UniqueName: \"kubernetes.io/projected/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-kube-api-access-8jn5h\") pod \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " Dec 16 07:16:52 crc kubenswrapper[4789]: I1216 07:16:52.950957 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts\") pod \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\" (UID: \"1964cf41-49d7-4b0d-ab8b-fbf9b621e359\") " Dec 16 07:16:52 crc kubenswrapper[4789]: I1216 07:16:52.951148 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:52 crc kubenswrapper[4789]: I1216 07:16:52.951819 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1964cf41-49d7-4b0d-ab8b-fbf9b621e359" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:52 crc kubenswrapper[4789]: I1216 07:16:52.958734 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-kube-api-access-8jn5h" (OuterVolumeSpecName: "kube-api-access-8jn5h") pod "1964cf41-49d7-4b0d-ab8b-fbf9b621e359" (UID: "1964cf41-49d7-4b0d-ab8b-fbf9b621e359"). InnerVolumeSpecName "kube-api-access-8jn5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.052440 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts\") pod \"0491a70b-b044-4ec4-b179-778967cd4573\" (UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.052577 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9dvd\" (UniqueName: \"kubernetes.io/projected/0491a70b-b044-4ec4-b179-778967cd4573-kube-api-access-q9dvd\") pod \"0491a70b-b044-4ec4-b179-778967cd4573\" (UID: \"0491a70b-b044-4ec4-b179-778967cd4573\") " Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.053026 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0491a70b-b044-4ec4-b179-778967cd4573" (UID: "0491a70b-b044-4ec4-b179-778967cd4573"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.053495 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jn5h\" (UniqueName: \"kubernetes.io/projected/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-kube-api-access-8jn5h\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.053512 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1964cf41-49d7-4b0d-ab8b-fbf9b621e359-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.053543 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0491a70b-b044-4ec4-b179-778967cd4573-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.055545 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0491a70b-b044-4ec4-b179-778967cd4573-kube-api-access-q9dvd" (OuterVolumeSpecName: "kube-api-access-q9dvd") pod "0491a70b-b044-4ec4-b179-778967cd4573" (UID: "0491a70b-b044-4ec4-b179-778967cd4573"). InnerVolumeSpecName "kube-api-access-q9dvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.154647 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9dvd\" (UniqueName: \"kubernetes.io/projected/0491a70b-b044-4ec4-b179-778967cd4573-kube-api-access-q9dvd\") on node \"crc\" DevicePath \"\"" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.380743 4789 generic.go:334] "Generic (PLEG): container finished" podID="0491a70b-b044-4ec4-b179-778967cd4573" containerID="e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7" exitCode=137 Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.380806 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifc07-account-delete-bxbmv" event={"ID":"0491a70b-b044-4ec4-b179-778967cd4573","Type":"ContainerDied","Data":"e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7"} Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.380834 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifc07-account-delete-bxbmv" event={"ID":"0491a70b-b044-4ec4-b179-778967cd4573","Type":"ContainerDied","Data":"0930e632f5911a19af40b37bfd265349ca7f713b1462c8df6cc8e79a6d764317"} Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.380805 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapifc07-account-delete-bxbmv" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.380854 4789 scope.go:117] "RemoveContainer" containerID="e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.382498 4789 generic.go:334] "Generic (PLEG): container finished" podID="1964cf41-49d7-4b0d-ab8b-fbf9b621e359" containerID="c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2" exitCode=137 Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.382528 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican30bf-account-delete-mwl8m" event={"ID":"1964cf41-49d7-4b0d-ab8b-fbf9b621e359","Type":"ContainerDied","Data":"c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2"} Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.382532 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican30bf-account-delete-mwl8m" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.382546 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican30bf-account-delete-mwl8m" event={"ID":"1964cf41-49d7-4b0d-ab8b-fbf9b621e359","Type":"ContainerDied","Data":"1f176ee0624459f5cffe99a925013e3dc2f1324da883fbea30790016109ae3fa"} Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.398952 4789 scope.go:117] "RemoveContainer" containerID="e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7" Dec 16 07:16:53 crc kubenswrapper[4789]: E1216 07:16:53.399293 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7\": container with ID starting with e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7 not found: ID does not exist" containerID="e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7" Dec 16 07:16:53 
crc kubenswrapper[4789]: I1216 07:16:53.399326 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7"} err="failed to get container status \"e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7\": rpc error: code = NotFound desc = could not find container \"e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7\": container with ID starting with e157e95877e3e28cff617d85501a60a8a5b712ed0abffeffa3658901dda2f7d7 not found: ID does not exist" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.399348 4789 scope.go:117] "RemoveContainer" containerID="c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.413325 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapifc07-account-delete-bxbmv"] Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.422493 4789 scope.go:117] "RemoveContainer" containerID="c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2" Dec 16 07:16:53 crc kubenswrapper[4789]: E1216 07:16:53.423066 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2\": container with ID starting with c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2 not found: ID does not exist" containerID="c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.423101 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2"} err="failed to get container status \"c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2\": rpc error: code = NotFound desc = could not find container 
\"c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2\": container with ID starting with c77365706aae68d61959bd0c9d68f21306ec4ad127dfd1688585a39ca59028e2 not found: ID does not exist" Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.424665 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapifc07-account-delete-bxbmv"] Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.429702 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican30bf-account-delete-mwl8m"] Dec 16 07:16:53 crc kubenswrapper[4789]: I1216 07:16:53.434092 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican30bf-account-delete-mwl8m"] Dec 16 07:16:54 crc kubenswrapper[4789]: I1216 07:16:54.112569 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0491a70b-b044-4ec4-b179-778967cd4573" path="/var/lib/kubelet/pods/0491a70b-b044-4ec4-b179-778967cd4573/volumes" Dec 16 07:16:54 crc kubenswrapper[4789]: I1216 07:16:54.113053 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1964cf41-49d7-4b0d-ab8b-fbf9b621e359" path="/var/lib/kubelet/pods/1964cf41-49d7-4b0d-ab8b-fbf9b621e359/volumes" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.162160 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p7mwc"] Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163052 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163071 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163085 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="extract-utilities" Dec 16 07:17:01 crc 
kubenswrapper[4789]: I1216 07:17:01.163093 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="extract-utilities" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163105 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0491a70b-b044-4ec4-b179-778967cd4573" containerName="mariadb-account-delete" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163113 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0491a70b-b044-4ec4-b179-778967cd4573" containerName="mariadb-account-delete" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163123 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-reaper" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163131 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-reaper" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163146 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163153 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163167 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163174 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163188 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-expirer" 
Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163195 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-expirer" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163205 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1964cf41-49d7-4b0d-ab8b-fbf9b621e359" containerName="mariadb-account-delete" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163213 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1964cf41-49d7-4b0d-ab8b-fbf9b621e359" containerName="mariadb-account-delete" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163227 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker-log" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163236 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker-log" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163248 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163256 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163266 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163274 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-server" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163290 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="registry-server" Dec 
16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163297 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="registry-server" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163309 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163316 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163327 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="swift-recon-cron" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163335 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="swift-recon-cron" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163344 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163351 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163361 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163367 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-server" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163377 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server-init" Dec 16 07:17:01 crc 
kubenswrapper[4789]: I1216 07:17:01.163384 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server-init" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163397 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163404 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-server" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163412 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-updater" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163419 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-updater" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163427 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-api" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163435 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-api" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163445 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="rsync" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163451 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="rsync" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163465 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 
07:17:01.163472 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163486 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-httpd" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163493 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-httpd" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163503 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-updater" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163509 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-updater" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163522 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163528 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163541 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163547 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163559 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 
07:17:01.163566 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163577 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener-log" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163585 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener-log" Dec 16 07:17:01 crc kubenswrapper[4789]: E1216 07:17:01.163595 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="extract-content" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163602 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="extract-content" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163748 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163761 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff29c31-7432-4682-a513-a1a6dcd9b276" containerName="registry-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163776 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163786 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-api" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163799 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" 
containerName="container-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163807 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="swift-recon-cron" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163815 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f668c2-651f-48f2-8feb-7faa470c3a19" containerName="barbican-keystone-listener-log" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163827 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker-log" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163833 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163843 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovs-vswitchd" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163850 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="rsync" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163859 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-updater" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163873 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-reaper" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163881 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5429404-d973-4580-961a-8ad6081e93ec" containerName="ovsdb-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163890 4789 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="73660d16-d925-4e43-8df7-2c40959bb7ed" containerName="neutron-httpd" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163901 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163937 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-expirer" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163945 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="account-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163954 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="container-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163964 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-updater" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163974 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0491a70b-b044-4ec4-b179-778967cd4573" containerName="mariadb-account-delete" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163981 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1964cf41-49d7-4b0d-ab8b-fbf9b621e359" containerName="mariadb-account-delete" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.163990 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5bd2649-9508-49bb-833e-7239b7d11d78" containerName="barbican-worker" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.164000 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-replicator" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.164009 4789 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-auditor" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.164021 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd6bd33-5f98-4eb6-9fee-5080941ee4c0" containerName="object-server" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.165041 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.174404 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7mwc"] Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.270224 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-catalog-content\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.270314 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtz5\" (UniqueName: \"kubernetes.io/projected/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-kube-api-access-cbtz5\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.270352 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-utilities\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 
07:17:01.371396 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-catalog-content\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.371470 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtz5\" (UniqueName: \"kubernetes.io/projected/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-kube-api-access-cbtz5\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.371497 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-utilities\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.371997 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-catalog-content\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.372034 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-utilities\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.394343 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cbtz5\" (UniqueName: \"kubernetes.io/projected/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-kube-api-access-cbtz5\") pod \"redhat-marketplace-p7mwc\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.481846 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:01 crc kubenswrapper[4789]: I1216 07:17:01.937653 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7mwc"] Dec 16 07:17:02 crc kubenswrapper[4789]: I1216 07:17:02.456066 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7mwc" event={"ID":"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781","Type":"ContainerStarted","Data":"ee8febeecb0c8bc3347b6e0ba35977bd72aead4e81eeb08b2abbb402c2464ab0"} Dec 16 07:17:03 crc kubenswrapper[4789]: I1216 07:17:03.466556 4789 generic.go:334] "Generic (PLEG): container finished" podID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerID="2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4" exitCode=0 Dec 16 07:17:03 crc kubenswrapper[4789]: I1216 07:17:03.466610 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7mwc" event={"ID":"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781","Type":"ContainerDied","Data":"2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4"} Dec 16 07:17:04 crc kubenswrapper[4789]: I1216 07:17:04.476228 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7mwc" event={"ID":"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781","Type":"ContainerStarted","Data":"c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814"} Dec 16 07:17:05 crc kubenswrapper[4789]: I1216 07:17:05.487222 4789 generic.go:334] "Generic (PLEG): container 
finished" podID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerID="c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814" exitCode=0 Dec 16 07:17:05 crc kubenswrapper[4789]: I1216 07:17:05.487280 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7mwc" event={"ID":"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781","Type":"ContainerDied","Data":"c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814"} Dec 16 07:17:11 crc kubenswrapper[4789]: I1216 07:17:11.537489 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7mwc" event={"ID":"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781","Type":"ContainerStarted","Data":"2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe"} Dec 16 07:17:11 crc kubenswrapper[4789]: I1216 07:17:11.555308 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p7mwc" podStartSLOduration=3.471013304 podStartE2EDuration="10.555289961s" podCreationTimestamp="2025-12-16 07:17:01 +0000 UTC" firstStartedPulling="2025-12-16 07:17:03.470166783 +0000 UTC m=+1561.732054412" lastFinishedPulling="2025-12-16 07:17:10.55444344 +0000 UTC m=+1568.816331069" observedRunningTime="2025-12-16 07:17:11.553021536 +0000 UTC m=+1569.814909185" watchObservedRunningTime="2025-12-16 07:17:11.555289961 +0000 UTC m=+1569.817177590" Dec 16 07:17:21 crc kubenswrapper[4789]: I1216 07:17:21.481988 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:21 crc kubenswrapper[4789]: I1216 07:17:21.482597 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:21 crc kubenswrapper[4789]: I1216 07:17:21.525803 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 
07:17:21 crc kubenswrapper[4789]: I1216 07:17:21.656523 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:21 crc kubenswrapper[4789]: I1216 07:17:21.756764 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7mwc"] Dec 16 07:17:23 crc kubenswrapper[4789]: I1216 07:17:23.638020 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p7mwc" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="registry-server" containerID="cri-o://2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe" gracePeriod=2 Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.084057 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.184664 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-catalog-content\") pod \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.184796 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-utilities\") pod \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.184930 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtz5\" (UniqueName: \"kubernetes.io/projected/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-kube-api-access-cbtz5\") pod \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\" (UID: \"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781\") " Dec 16 
07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.185980 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-utilities" (OuterVolumeSpecName: "utilities") pod "cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" (UID: "cace9bf8-ac3d-4be9-b45e-4d4dd0c99781"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.192180 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-kube-api-access-cbtz5" (OuterVolumeSpecName: "kube-api-access-cbtz5") pod "cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" (UID: "cace9bf8-ac3d-4be9-b45e-4d4dd0c99781"). InnerVolumeSpecName "kube-api-access-cbtz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.207942 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" (UID: "cace9bf8-ac3d-4be9-b45e-4d4dd0c99781"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.286109 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.286138 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtz5\" (UniqueName: \"kubernetes.io/projected/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-kube-api-access-cbtz5\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.286149 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.647932 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7mwc" event={"ID":"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781","Type":"ContainerDied","Data":"2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe"} Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.648267 4789 generic.go:334] "Generic (PLEG): container finished" podID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerID="2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe" exitCode=0 Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.648293 4789 scope.go:117] "RemoveContainer" containerID="2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.648305 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p7mwc" event={"ID":"cace9bf8-ac3d-4be9-b45e-4d4dd0c99781","Type":"ContainerDied","Data":"ee8febeecb0c8bc3347b6e0ba35977bd72aead4e81eeb08b2abbb402c2464ab0"} Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 
07:17:24.648095 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p7mwc" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.677714 4789 scope.go:117] "RemoveContainer" containerID="c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.690095 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7mwc"] Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.693153 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p7mwc"] Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.713977 4789 scope.go:117] "RemoveContainer" containerID="2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.736462 4789 scope.go:117] "RemoveContainer" containerID="2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe" Dec 16 07:17:24 crc kubenswrapper[4789]: E1216 07:17:24.736945 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe\": container with ID starting with 2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe not found: ID does not exist" containerID="2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.736986 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe"} err="failed to get container status \"2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe\": rpc error: code = NotFound desc = could not find container \"2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe\": container with ID starting with 
2d9578e92ab82dd7bb213de29d0a764b396b4d013db03de359f86fc2bdbcbcbe not found: ID does not exist" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.737031 4789 scope.go:117] "RemoveContainer" containerID="c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814" Dec 16 07:17:24 crc kubenswrapper[4789]: E1216 07:17:24.737282 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814\": container with ID starting with c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814 not found: ID does not exist" containerID="c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.737315 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814"} err="failed to get container status \"c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814\": rpc error: code = NotFound desc = could not find container \"c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814\": container with ID starting with c5df7136f27a315f36d456ca5063f76852328e3d185bc0cde29034b27ed15814 not found: ID does not exist" Dec 16 07:17:24 crc kubenswrapper[4789]: I1216 07:17:24.737336 4789 scope.go:117] "RemoveContainer" containerID="2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4" Dec 16 07:17:24 crc kubenswrapper[4789]: E1216 07:17:24.737531 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4\": container with ID starting with 2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4 not found: ID does not exist" containerID="2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4" Dec 16 07:17:24 crc 
kubenswrapper[4789]: I1216 07:17:24.737577 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4"} err="failed to get container status \"2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4\": rpc error: code = NotFound desc = could not find container \"2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4\": container with ID starting with 2c55f3f61bd857e225fbb3f67de3ab018235229fbd1c4937b493dbe0e597f4f4 not found: ID does not exist" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.112774 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" path="/var/lib/kubelet/pods/cace9bf8-ac3d-4be9-b45e-4d4dd0c99781/volumes" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.144897 4789 scope.go:117] "RemoveContainer" containerID="6b9b43486b394346d4ab06609223e399c69302b8c06ad02c18fe02f7f5d8d2d8" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.167054 4789 scope.go:117] "RemoveContainer" containerID="2968db79927abd4bde0e18043eeec75e605baf4886fb111fa203e77c24e8aa6d" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.191726 4789 scope.go:117] "RemoveContainer" containerID="52cc706873c97fe5550bec5c9f9177edfc302bf236179eb223b3d953638f9aaf" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.220766 4789 scope.go:117] "RemoveContainer" containerID="f7818360dd745ad4d1a1a3f20422491e4c611bad12433d1f211fc60b950a17fb" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.245806 4789 scope.go:117] "RemoveContainer" containerID="3f6b0abba557f232b48bdb5ec4645df3b1b38170d8d6f2440c380ed6a141dd6d" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.269003 4789 scope.go:117] "RemoveContainer" containerID="523184f43f6a87633c2dfba043eb3ad3efe76b7901ca80b628fa924a53fb6f83" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.291944 4789 scope.go:117] "RemoveContainer" 
containerID="ab861ee372290f4f8fec30aa7a49246b3ed73a01f2cae71985e21abf394dedd1" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.313637 4789 scope.go:117] "RemoveContainer" containerID="1a69f192606363373bec9813a6b1bcee5852eefa076923d229a54ab7daf7d583" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.335457 4789 scope.go:117] "RemoveContainer" containerID="2a59947be685694967d3e633218d3dd126cc65e9f818fd0449696960e2ac3ffd" Dec 16 07:17:26 crc kubenswrapper[4789]: I1216 07:17:26.352297 4789 scope.go:117] "RemoveContainer" containerID="49ce1da469ce6b385a94da8426c33871259532e85a44ed87376c8c1678c0c690" Dec 16 07:18:21 crc kubenswrapper[4789]: I1216 07:18:21.928404 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:18:21 crc kubenswrapper[4789]: I1216 07:18:21.929830 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.590306 4789 scope.go:117] "RemoveContainer" containerID="a4c8b978fe12bb63bc957bf77935159302b7af3ef11194e3a4ad5ee214f0ad07" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.629748 4789 scope.go:117] "RemoveContainer" containerID="ccfba8128cb37296a7ad5efdd726c44e3192d20ed3ebc813659f9a92598d3b42" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.649999 4789 scope.go:117] "RemoveContainer" containerID="c8b668331c6026beabbd783213a89f766a203e33aae00ba9b1d79a7de6730e9f" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.701090 4789 scope.go:117] "RemoveContainer" 
containerID="4d4f28d29d9e45c9d1a534d1b5655094d706f77fa22f2d906a42b362484f9d25" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.719783 4789 scope.go:117] "RemoveContainer" containerID="38532090bd6f4cace6fd83a68fadf8e82aefac3dad9c257c36af02a4dd1033b5" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.743036 4789 scope.go:117] "RemoveContainer" containerID="532fd11f65282fc52bc292b7aff8cda54f4fc5a01dcbc82485f8e436746b9749" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.775370 4789 scope.go:117] "RemoveContainer" containerID="0b0240b70cddddde386248f7aa63d1008a41744df7cabf3cdb8893cad6a13888" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.811249 4789 scope.go:117] "RemoveContainer" containerID="f41be4360c65f86b476a62a1224e82bdbdfd4163db8a0afd3bb6ffc86b640b24" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.832868 4789 scope.go:117] "RemoveContainer" containerID="3de1d641f5bc055659878f3fb9702aef4d0f671e418ce7cddb37b9a7b2ceb48b" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.857475 4789 scope.go:117] "RemoveContainer" containerID="5006dc4ca327d0dd04cdaac12e74e1592268701a5255897d3cbe17ad6fe5b2c2" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.873346 4789 scope.go:117] "RemoveContainer" containerID="fa4802d6b1f9b4ab40abb3aee482c4a9ba03e2786e17ce87dee90707888b166f" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.887810 4789 scope.go:117] "RemoveContainer" containerID="8db8c91f3588978f5cc5cf7aa8691f106f9c0d392f047d1074534065dd1f409d" Dec 16 07:18:26 crc kubenswrapper[4789]: I1216 07:18:26.908438 4789 scope.go:117] "RemoveContainer" containerID="a4e1972471943947df87b1d22c704377beeab31b729a1c467937dfb3523caf4d" Dec 16 07:18:51 crc kubenswrapper[4789]: I1216 07:18:51.928155 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 16 07:18:51 crc kubenswrapper[4789]: I1216 07:18:51.929281 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:19:21 crc kubenswrapper[4789]: I1216 07:19:21.927712 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:19:21 crc kubenswrapper[4789]: I1216 07:19:21.928308 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:19:21 crc kubenswrapper[4789]: I1216 07:19:21.928365 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:19:21 crc kubenswrapper[4789]: I1216 07:19:21.929110 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:19:21 crc kubenswrapper[4789]: I1216 07:19:21.929188 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" gracePeriod=600 Dec 16 07:19:22 crc kubenswrapper[4789]: E1216 07:19:22.052869 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:19:22 crc kubenswrapper[4789]: I1216 07:19:22.537280 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" exitCode=0 Dec 16 07:19:22 crc kubenswrapper[4789]: I1216 07:19:22.537361 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6"} Dec 16 07:19:22 crc kubenswrapper[4789]: I1216 07:19:22.537639 4789 scope.go:117] "RemoveContainer" containerID="c8393873a978af7e8e2aad1167caa21ec29d5fd3e46fb65f45bf1708f741ab20" Dec 16 07:19:22 crc kubenswrapper[4789]: I1216 07:19:22.538138 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:19:22 crc kubenswrapper[4789]: E1216 07:19:22.538391 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:19:27 crc kubenswrapper[4789]: I1216 07:19:27.067704 4789 scope.go:117] "RemoveContainer" containerID="d35da1999fd3dc35eaf2c9bde171ffdbd72680cf0a247a9b921b35018bb0d859" Dec 16 07:19:27 crc kubenswrapper[4789]: I1216 07:19:27.117273 4789 scope.go:117] "RemoveContainer" containerID="6817790e0d358dcc4395f0de504a86d8b3b7db41eb13bb80735e722842fce735" Dec 16 07:19:35 crc kubenswrapper[4789]: I1216 07:19:35.107698 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:19:35 crc kubenswrapper[4789]: E1216 07:19:35.108501 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:19:48 crc kubenswrapper[4789]: I1216 07:19:48.105120 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:19:48 crc kubenswrapper[4789]: E1216 07:19:48.105743 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:20:03 crc kubenswrapper[4789]: I1216 07:20:03.105691 4789 
scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:20:03 crc kubenswrapper[4789]: E1216 07:20:03.106335 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:20:17 crc kubenswrapper[4789]: I1216 07:20:17.104701 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:20:17 crc kubenswrapper[4789]: E1216 07:20:17.105192 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:20:27 crc kubenswrapper[4789]: I1216 07:20:27.189863 4789 scope.go:117] "RemoveContainer" containerID="0b13c83f5ec120dc4feea18f65f1c3f02935a2d38f0813d1ecc61531e9b5b8f2" Dec 16 07:20:27 crc kubenswrapper[4789]: I1216 07:20:27.232138 4789 scope.go:117] "RemoveContainer" containerID="3549bc0cc1314556e102d5bb9c5e370800c3e9c50b9cb85c48387cd83a095e2c" Dec 16 07:20:27 crc kubenswrapper[4789]: I1216 07:20:27.249896 4789 scope.go:117] "RemoveContainer" containerID="5507caaa355e89c063f63a2cefa09e35c12ab6eb6ce0b04a31fa77618d0f85bd" Dec 16 07:20:27 crc kubenswrapper[4789]: I1216 07:20:27.276362 4789 scope.go:117] "RemoveContainer" containerID="2d1bbeab0e372abd0b616ae4a4940235c89a48f444bb64d8b567503ce48488cb" Dec 16 
07:20:27 crc kubenswrapper[4789]: I1216 07:20:27.301050 4789 scope.go:117] "RemoveContainer" containerID="f04069955550582d7569e8c1e11f96772b8e1ef3da29e68d4a5c4e6db554a44f" Dec 16 07:20:27 crc kubenswrapper[4789]: I1216 07:20:27.343451 4789 scope.go:117] "RemoveContainer" containerID="23e575aa3f3085bd2b26af1b9af05f6c37b3a79d95b825164e72059a6443cf01" Dec 16 07:20:27 crc kubenswrapper[4789]: I1216 07:20:27.369070 4789 scope.go:117] "RemoveContainer" containerID="81df070f3270ac471d25411419bf933d40f0a828bdcac35dc7081c9758497a34" Dec 16 07:20:29 crc kubenswrapper[4789]: I1216 07:20:29.104990 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:20:29 crc kubenswrapper[4789]: E1216 07:20:29.105566 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:20:43 crc kubenswrapper[4789]: I1216 07:20:43.105410 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:20:43 crc kubenswrapper[4789]: E1216 07:20:43.106162 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:20:55 crc kubenswrapper[4789]: I1216 07:20:55.105231 4789 scope.go:117] "RemoveContainer" 
containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:20:55 crc kubenswrapper[4789]: E1216 07:20:55.105980 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:21:07 crc kubenswrapper[4789]: I1216 07:21:07.104869 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:21:07 crc kubenswrapper[4789]: E1216 07:21:07.105601 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:21:19 crc kubenswrapper[4789]: I1216 07:21:19.105071 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:21:19 crc kubenswrapper[4789]: E1216 07:21:19.105664 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:21:27 crc kubenswrapper[4789]: I1216 07:21:27.483025 4789 scope.go:117] 
"RemoveContainer" containerID="94400587bfb136f7e982ac2bdabf638fd15508e132e3af50147661d7ccefa7f1" Dec 16 07:21:27 crc kubenswrapper[4789]: I1216 07:21:27.524384 4789 scope.go:117] "RemoveContainer" containerID="ea367b9824a152f5e9f0d2ff9d6db99fdb1d7a70ba0b2b4e1be6b907ac9c6eb4" Dec 16 07:21:27 crc kubenswrapper[4789]: I1216 07:21:27.564927 4789 scope.go:117] "RemoveContainer" containerID="5363a9b917a24482d684b0616aeb1c17a20bcb0aac7b8fa2460fcb0158ec6b7d" Dec 16 07:21:34 crc kubenswrapper[4789]: I1216 07:21:34.105597 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:21:34 crc kubenswrapper[4789]: E1216 07:21:34.106277 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.641303 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nfwtf"] Dec 16 07:21:38 crc kubenswrapper[4789]: E1216 07:21:38.642072 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="registry-server" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.642090 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="registry-server" Dec 16 07:21:38 crc kubenswrapper[4789]: E1216 07:21:38.642113 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="extract-content" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.642122 4789 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="extract-content" Dec 16 07:21:38 crc kubenswrapper[4789]: E1216 07:21:38.642141 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="extract-utilities" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.642149 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="extract-utilities" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.642322 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cace9bf8-ac3d-4be9-b45e-4d4dd0c99781" containerName="registry-server" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.643527 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.655542 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfwtf"] Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.726007 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-catalog-content\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.726127 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7r5s\" (UniqueName: \"kubernetes.io/projected/a0e72b04-9f6e-4171-9c1b-279993292cd4-kube-api-access-x7r5s\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.726360 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-utilities\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.827869 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-utilities\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.827955 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-catalog-content\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.828009 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7r5s\" (UniqueName: \"kubernetes.io/projected/a0e72b04-9f6e-4171-9c1b-279993292cd4-kube-api-access-x7r5s\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.828411 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-catalog-content\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.828492 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-utilities\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.847805 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7r5s\" (UniqueName: \"kubernetes.io/projected/a0e72b04-9f6e-4171-9c1b-279993292cd4-kube-api-access-x7r5s\") pod \"certified-operators-nfwtf\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:38 crc kubenswrapper[4789]: I1216 07:21:38.964234 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:39 crc kubenswrapper[4789]: I1216 07:21:39.446392 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfwtf"] Dec 16 07:21:39 crc kubenswrapper[4789]: I1216 07:21:39.476983 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfwtf" event={"ID":"a0e72b04-9f6e-4171-9c1b-279993292cd4","Type":"ContainerStarted","Data":"931c2e6ba34bfe44ec815ffccd097eabde1ac0ffa284ac7e83524b45c5f8f36d"} Dec 16 07:21:40 crc kubenswrapper[4789]: I1216 07:21:40.485372 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerID="a1631777629bde5c6400d8d232e10cfef4cd05ca892f82a0b79fbf00e88419ed" exitCode=0 Dec 16 07:21:40 crc kubenswrapper[4789]: I1216 07:21:40.485644 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfwtf" event={"ID":"a0e72b04-9f6e-4171-9c1b-279993292cd4","Type":"ContainerDied","Data":"a1631777629bde5c6400d8d232e10cfef4cd05ca892f82a0b79fbf00e88419ed"} Dec 16 07:21:40 crc 
kubenswrapper[4789]: I1216 07:21:40.487782 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.046628 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n228q"] Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.048507 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.055540 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n228q"] Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.162783 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7867\" (UniqueName: \"kubernetes.io/projected/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-kube-api-access-d7867\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.162854 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-utilities\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.162941 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-catalog-content\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.263903 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d7867\" (UniqueName: \"kubernetes.io/projected/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-kube-api-access-d7867\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.263960 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-utilities\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.264002 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-catalog-content\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.264521 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-catalog-content\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.264574 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-utilities\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.292066 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7867\" 
(UniqueName: \"kubernetes.io/projected/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-kube-api-access-d7867\") pod \"redhat-operators-n228q\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.368310 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:41 crc kubenswrapper[4789]: I1216 07:21:41.815387 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n228q"] Dec 16 07:21:42 crc kubenswrapper[4789]: I1216 07:21:42.499077 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerID="ea0a89eb9250a7655bab836e1122372374e312f47780ede9726bb1541a2b37ba" exitCode=0 Dec 16 07:21:42 crc kubenswrapper[4789]: I1216 07:21:42.499169 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfwtf" event={"ID":"a0e72b04-9f6e-4171-9c1b-279993292cd4","Type":"ContainerDied","Data":"ea0a89eb9250a7655bab836e1122372374e312f47780ede9726bb1541a2b37ba"} Dec 16 07:21:42 crc kubenswrapper[4789]: I1216 07:21:42.500811 4789 generic.go:334] "Generic (PLEG): container finished" podID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerID="6f4b653daecba5c6648142a34a0a08eae3ffa19f889ea223111e7d32b3379899" exitCode=0 Dec 16 07:21:42 crc kubenswrapper[4789]: I1216 07:21:42.500836 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n228q" event={"ID":"67265e1e-45f8-4f97-9eab-8fb2e4ba207f","Type":"ContainerDied","Data":"6f4b653daecba5c6648142a34a0a08eae3ffa19f889ea223111e7d32b3379899"} Dec 16 07:21:42 crc kubenswrapper[4789]: I1216 07:21:42.500851 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n228q" 
event={"ID":"67265e1e-45f8-4f97-9eab-8fb2e4ba207f","Type":"ContainerStarted","Data":"908e753b559414e29355a75c904ca72816598f46d9a0d5f9c7edd94103ec25a5"} Dec 16 07:21:43 crc kubenswrapper[4789]: I1216 07:21:43.509605 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n228q" event={"ID":"67265e1e-45f8-4f97-9eab-8fb2e4ba207f","Type":"ContainerStarted","Data":"087e29d174d1ba82829ffa8ac99ca9803db2b34ea0619d168f98a0b371ad4210"} Dec 16 07:21:43 crc kubenswrapper[4789]: I1216 07:21:43.512420 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfwtf" event={"ID":"a0e72b04-9f6e-4171-9c1b-279993292cd4","Type":"ContainerStarted","Data":"44f8041d5fddd712117eec44c7bdd10c0fa6fc2934d89f7a76b8206570e16b02"} Dec 16 07:21:43 crc kubenswrapper[4789]: I1216 07:21:43.547743 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nfwtf" podStartSLOduration=3.057139075 podStartE2EDuration="5.547723622s" podCreationTimestamp="2025-12-16 07:21:38 +0000 UTC" firstStartedPulling="2025-12-16 07:21:40.487555174 +0000 UTC m=+1838.749442803" lastFinishedPulling="2025-12-16 07:21:42.978139721 +0000 UTC m=+1841.240027350" observedRunningTime="2025-12-16 07:21:43.543487789 +0000 UTC m=+1841.805375418" watchObservedRunningTime="2025-12-16 07:21:43.547723622 +0000 UTC m=+1841.809611251" Dec 16 07:21:44 crc kubenswrapper[4789]: I1216 07:21:44.519758 4789 generic.go:334] "Generic (PLEG): container finished" podID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerID="087e29d174d1ba82829ffa8ac99ca9803db2b34ea0619d168f98a0b371ad4210" exitCode=0 Dec 16 07:21:44 crc kubenswrapper[4789]: I1216 07:21:44.521236 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n228q" 
event={"ID":"67265e1e-45f8-4f97-9eab-8fb2e4ba207f","Type":"ContainerDied","Data":"087e29d174d1ba82829ffa8ac99ca9803db2b34ea0619d168f98a0b371ad4210"} Dec 16 07:21:45 crc kubenswrapper[4789]: I1216 07:21:45.530661 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n228q" event={"ID":"67265e1e-45f8-4f97-9eab-8fb2e4ba207f","Type":"ContainerStarted","Data":"ed02d5041f00ac4a79ef15f3d2cd9c367a5609ee8e8bf19da0893d9925a971b5"} Dec 16 07:21:45 crc kubenswrapper[4789]: I1216 07:21:45.553112 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n228q" podStartSLOduration=1.896854998 podStartE2EDuration="4.553092536s" podCreationTimestamp="2025-12-16 07:21:41 +0000 UTC" firstStartedPulling="2025-12-16 07:21:42.501851846 +0000 UTC m=+1840.763739475" lastFinishedPulling="2025-12-16 07:21:45.158089384 +0000 UTC m=+1843.419977013" observedRunningTime="2025-12-16 07:21:45.548882804 +0000 UTC m=+1843.810770443" watchObservedRunningTime="2025-12-16 07:21:45.553092536 +0000 UTC m=+1843.814980165" Dec 16 07:21:47 crc kubenswrapper[4789]: I1216 07:21:47.105031 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:21:47 crc kubenswrapper[4789]: E1216 07:21:47.105270 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:21:48 crc kubenswrapper[4789]: I1216 07:21:48.964934 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:48 crc 
kubenswrapper[4789]: I1216 07:21:48.965260 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:49 crc kubenswrapper[4789]: I1216 07:21:49.011074 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:49 crc kubenswrapper[4789]: I1216 07:21:49.595273 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:50 crc kubenswrapper[4789]: I1216 07:21:50.432838 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfwtf"] Dec 16 07:21:51 crc kubenswrapper[4789]: I1216 07:21:51.368660 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:51 crc kubenswrapper[4789]: I1216 07:21:51.368751 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:51 crc kubenswrapper[4789]: I1216 07:21:51.408742 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:51 crc kubenswrapper[4789]: I1216 07:21:51.569815 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nfwtf" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="registry-server" containerID="cri-o://44f8041d5fddd712117eec44c7bdd10c0fa6fc2934d89f7a76b8206570e16b02" gracePeriod=2 Dec 16 07:21:51 crc kubenswrapper[4789]: I1216 07:21:51.609291 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:52 crc kubenswrapper[4789]: I1216 07:21:52.833093 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-n228q"] Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.585609 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerID="44f8041d5fddd712117eec44c7bdd10c0fa6fc2934d89f7a76b8206570e16b02" exitCode=0 Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.585689 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfwtf" event={"ID":"a0e72b04-9f6e-4171-9c1b-279993292cd4","Type":"ContainerDied","Data":"44f8041d5fddd712117eec44c7bdd10c0fa6fc2934d89f7a76b8206570e16b02"} Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.586107 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n228q" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="registry-server" containerID="cri-o://ed02d5041f00ac4a79ef15f3d2cd9c367a5609ee8e8bf19da0893d9925a971b5" gracePeriod=2 Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.845007 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.858544 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7r5s\" (UniqueName: \"kubernetes.io/projected/a0e72b04-9f6e-4171-9c1b-279993292cd4-kube-api-access-x7r5s\") pod \"a0e72b04-9f6e-4171-9c1b-279993292cd4\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.858659 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-utilities\") pod \"a0e72b04-9f6e-4171-9c1b-279993292cd4\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.858733 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-catalog-content\") pod \"a0e72b04-9f6e-4171-9c1b-279993292cd4\" (UID: \"a0e72b04-9f6e-4171-9c1b-279993292cd4\") " Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.862537 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-utilities" (OuterVolumeSpecName: "utilities") pod "a0e72b04-9f6e-4171-9c1b-279993292cd4" (UID: "a0e72b04-9f6e-4171-9c1b-279993292cd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.867880 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e72b04-9f6e-4171-9c1b-279993292cd4-kube-api-access-x7r5s" (OuterVolumeSpecName: "kube-api-access-x7r5s") pod "a0e72b04-9f6e-4171-9c1b-279993292cd4" (UID: "a0e72b04-9f6e-4171-9c1b-279993292cd4"). InnerVolumeSpecName "kube-api-access-x7r5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.924440 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0e72b04-9f6e-4171-9c1b-279993292cd4" (UID: "a0e72b04-9f6e-4171-9c1b-279993292cd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.960741 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7r5s\" (UniqueName: \"kubernetes.io/projected/a0e72b04-9f6e-4171-9c1b-279993292cd4-kube-api-access-x7r5s\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.960773 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:53 crc kubenswrapper[4789]: I1216 07:21:53.960782 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e72b04-9f6e-4171-9c1b-279993292cd4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.601564 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfwtf" event={"ID":"a0e72b04-9f6e-4171-9c1b-279993292cd4","Type":"ContainerDied","Data":"931c2e6ba34bfe44ec815ffccd097eabde1ac0ffa284ac7e83524b45c5f8f36d"} Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.601943 4789 scope.go:117] "RemoveContainer" containerID="44f8041d5fddd712117eec44c7bdd10c0fa6fc2934d89f7a76b8206570e16b02" Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.602079 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nfwtf" Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.607477 4789 generic.go:334] "Generic (PLEG): container finished" podID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerID="ed02d5041f00ac4a79ef15f3d2cd9c367a5609ee8e8bf19da0893d9925a971b5" exitCode=0 Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.607505 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n228q" event={"ID":"67265e1e-45f8-4f97-9eab-8fb2e4ba207f","Type":"ContainerDied","Data":"ed02d5041f00ac4a79ef15f3d2cd9c367a5609ee8e8bf19da0893d9925a971b5"} Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.627089 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfwtf"] Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.630138 4789 scope.go:117] "RemoveContainer" containerID="ea0a89eb9250a7655bab836e1122372374e312f47780ede9726bb1541a2b37ba" Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.632084 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nfwtf"] Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.652541 4789 scope.go:117] "RemoveContainer" containerID="a1631777629bde5c6400d8d232e10cfef4cd05ca892f82a0b79fbf00e88419ed" Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.796783 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.970942 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-utilities\") pod \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.971062 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7867\" (UniqueName: \"kubernetes.io/projected/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-kube-api-access-d7867\") pod \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.971122 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-catalog-content\") pod \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\" (UID: \"67265e1e-45f8-4f97-9eab-8fb2e4ba207f\") " Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.972112 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-utilities" (OuterVolumeSpecName: "utilities") pod "67265e1e-45f8-4f97-9eab-8fb2e4ba207f" (UID: "67265e1e-45f8-4f97-9eab-8fb2e4ba207f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:54 crc kubenswrapper[4789]: I1216 07:21:54.975876 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-kube-api-access-d7867" (OuterVolumeSpecName: "kube-api-access-d7867") pod "67265e1e-45f8-4f97-9eab-8fb2e4ba207f" (UID: "67265e1e-45f8-4f97-9eab-8fb2e4ba207f"). InnerVolumeSpecName "kube-api-access-d7867". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.072578 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7867\" (UniqueName: \"kubernetes.io/projected/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-kube-api-access-d7867\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.072617 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.089337 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67265e1e-45f8-4f97-9eab-8fb2e4ba207f" (UID: "67265e1e-45f8-4f97-9eab-8fb2e4ba207f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.173688 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67265e1e-45f8-4f97-9eab-8fb2e4ba207f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.618422 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n228q" event={"ID":"67265e1e-45f8-4f97-9eab-8fb2e4ba207f","Type":"ContainerDied","Data":"908e753b559414e29355a75c904ca72816598f46d9a0d5f9c7edd94103ec25a5"} Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.618461 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n228q" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.618473 4789 scope.go:117] "RemoveContainer" containerID="ed02d5041f00ac4a79ef15f3d2cd9c367a5609ee8e8bf19da0893d9925a971b5" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.639696 4789 scope.go:117] "RemoveContainer" containerID="087e29d174d1ba82829ffa8ac99ca9803db2b34ea0619d168f98a0b371ad4210" Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.662887 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n228q"] Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.667987 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n228q"] Dec 16 07:21:55 crc kubenswrapper[4789]: I1216 07:21:55.668131 4789 scope.go:117] "RemoveContainer" containerID="6f4b653daecba5c6648142a34a0a08eae3ffa19f889ea223111e7d32b3379899" Dec 16 07:21:56 crc kubenswrapper[4789]: I1216 07:21:56.113752 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" path="/var/lib/kubelet/pods/67265e1e-45f8-4f97-9eab-8fb2e4ba207f/volumes" Dec 16 07:21:56 crc kubenswrapper[4789]: I1216 07:21:56.115263 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" path="/var/lib/kubelet/pods/a0e72b04-9f6e-4171-9c1b-279993292cd4/volumes" Dec 16 07:22:02 crc kubenswrapper[4789]: I1216 07:22:02.110805 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:22:02 crc kubenswrapper[4789]: E1216 07:22:02.111651 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:22:13 crc kubenswrapper[4789]: I1216 07:22:13.107306 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:22:13 crc kubenswrapper[4789]: E1216 07:22:13.108226 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:22:27 crc kubenswrapper[4789]: I1216 07:22:27.652092 4789 scope.go:117] "RemoveContainer" containerID="d60353fa5ba849ea6e4e0c3d7ef76b89ce84019c3a133410894721c746ae9ea9" Dec 16 07:22:27 crc kubenswrapper[4789]: I1216 07:22:27.673268 4789 scope.go:117] "RemoveContainer" containerID="8bbc52e42bff37f691a5171e008b43120b3eb76895f170e5be9ef714237b885f" Dec 16 07:22:27 crc kubenswrapper[4789]: I1216 07:22:27.694807 4789 scope.go:117] "RemoveContainer" containerID="219fbc2ee038a6ec79a5c2ed6cf3962635b1244065178fb8a4e9bde8190c278e" Dec 16 07:22:27 crc kubenswrapper[4789]: I1216 07:22:27.715664 4789 scope.go:117] "RemoveContainer" containerID="94f733234febab44be66c3a6362c59850bc8bebe5864b744fd69d4b571a3292f" Dec 16 07:22:27 crc kubenswrapper[4789]: I1216 07:22:27.743441 4789 scope.go:117] "RemoveContainer" containerID="e1eca940464b8ae3b391be41652790cb6e58d7ba7d126643ddbe8f964dc950c4" Dec 16 07:22:28 crc kubenswrapper[4789]: I1216 07:22:28.105713 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:22:28 crc 
kubenswrapper[4789]: E1216 07:22:28.105950 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:22:39 crc kubenswrapper[4789]: I1216 07:22:39.105173 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:22:39 crc kubenswrapper[4789]: E1216 07:22:39.105907 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:22:54 crc kubenswrapper[4789]: I1216 07:22:54.105439 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:22:54 crc kubenswrapper[4789]: E1216 07:22:54.106059 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:23:05 crc kubenswrapper[4789]: I1216 07:23:05.105104 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 
16 07:23:05 crc kubenswrapper[4789]: E1216 07:23:05.106201 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:23:20 crc kubenswrapper[4789]: I1216 07:23:20.105012 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:23:20 crc kubenswrapper[4789]: E1216 07:23:20.105760 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:23:34 crc kubenswrapper[4789]: I1216 07:23:34.105220 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:23:34 crc kubenswrapper[4789]: E1216 07:23:34.106295 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:23:46 crc kubenswrapper[4789]: I1216 07:23:46.106095 4789 scope.go:117] "RemoveContainer" 
containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:23:46 crc kubenswrapper[4789]: E1216 07:23:46.107139 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:23:59 crc kubenswrapper[4789]: I1216 07:23:59.105003 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:23:59 crc kubenswrapper[4789]: E1216 07:23:59.105857 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:24:13 crc kubenswrapper[4789]: I1216 07:24:13.105205 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:24:13 crc kubenswrapper[4789]: E1216 07:24:13.106164 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:24:28 crc kubenswrapper[4789]: I1216 07:24:28.105342 4789 scope.go:117] 
"RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:24:28 crc kubenswrapper[4789]: I1216 07:24:28.924204 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"a9bab3fc5a6efc59a48aae9a5179bf05df99f772dee59fd9fcec844e562d2c74"} Dec 16 07:26:51 crc kubenswrapper[4789]: I1216 07:26:51.927670 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:26:51 crc kubenswrapper[4789]: I1216 07:26:51.928243 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:27:21 crc kubenswrapper[4789]: I1216 07:27:21.928377 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:27:21 crc kubenswrapper[4789]: I1216 07:27:21.929030 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.590464 4789 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2lwrk"] Dec 16 07:27:32 crc kubenswrapper[4789]: E1216 07:27:32.591314 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="registry-server" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591326 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="registry-server" Dec 16 07:27:32 crc kubenswrapper[4789]: E1216 07:27:32.591340 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="extract-utilities" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591346 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="extract-utilities" Dec 16 07:27:32 crc kubenswrapper[4789]: E1216 07:27:32.591364 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="extract-content" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591370 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="extract-content" Dec 16 07:27:32 crc kubenswrapper[4789]: E1216 07:27:32.591379 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="registry-server" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591385 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="registry-server" Dec 16 07:27:32 crc kubenswrapper[4789]: E1216 07:27:32.591397 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="extract-content" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591403 4789 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="extract-content" Dec 16 07:27:32 crc kubenswrapper[4789]: E1216 07:27:32.591411 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="extract-utilities" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591417 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="extract-utilities" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591541 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="67265e1e-45f8-4f97-9eab-8fb2e4ba207f" containerName="registry-server" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.591550 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e72b04-9f6e-4171-9c1b-279993292cd4" containerName="registry-server" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.592446 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.604145 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lwrk"] Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.699972 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7339aad-951b-4f0f-8868-44e1d98f5871-utilities\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.700010 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7339aad-951b-4f0f-8868-44e1d98f5871-catalog-content\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.700054 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/c7339aad-951b-4f0f-8868-44e1d98f5871-kube-api-access-ff42h\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.801135 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7339aad-951b-4f0f-8868-44e1d98f5871-utilities\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.801178 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7339aad-951b-4f0f-8868-44e1d98f5871-catalog-content\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.801221 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/c7339aad-951b-4f0f-8868-44e1d98f5871-kube-api-access-ff42h\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.801667 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7339aad-951b-4f0f-8868-44e1d98f5871-utilities\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.802186 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7339aad-951b-4f0f-8868-44e1d98f5871-catalog-content\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.824242 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/c7339aad-951b-4f0f-8868-44e1d98f5871-kube-api-access-ff42h\") pod \"community-operators-2lwrk\" (UID: \"c7339aad-951b-4f0f-8868-44e1d98f5871\") " pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:32 crc kubenswrapper[4789]: I1216 07:27:32.919110 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:33 crc kubenswrapper[4789]: I1216 07:27:33.368528 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lwrk"] Dec 16 07:27:34 crc kubenswrapper[4789]: I1216 07:27:34.224802 4789 generic.go:334] "Generic (PLEG): container finished" podID="c7339aad-951b-4f0f-8868-44e1d98f5871" containerID="794b9d541305a43e44bf0c587f12ccdaf7ad982e74c2aaf5c4d6484cf2258186" exitCode=0 Dec 16 07:27:34 crc kubenswrapper[4789]: I1216 07:27:34.224877 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lwrk" event={"ID":"c7339aad-951b-4f0f-8868-44e1d98f5871","Type":"ContainerDied","Data":"794b9d541305a43e44bf0c587f12ccdaf7ad982e74c2aaf5c4d6484cf2258186"} Dec 16 07:27:34 crc kubenswrapper[4789]: I1216 07:27:34.224970 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lwrk" event={"ID":"c7339aad-951b-4f0f-8868-44e1d98f5871","Type":"ContainerStarted","Data":"9702c8c3d30381e2c1c4ab4c60872a1f191613ddcf54715aa01eb4f76ea55968"} Dec 16 07:27:34 crc kubenswrapper[4789]: I1216 07:27:34.230371 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:27:38 crc kubenswrapper[4789]: I1216 07:27:38.248484 4789 generic.go:334] "Generic (PLEG): container finished" podID="c7339aad-951b-4f0f-8868-44e1d98f5871" containerID="954bb79f22b1319b1e493338dcda0d5ccbe7d62c121435498561572ffda4a6ee" exitCode=0 Dec 16 07:27:38 crc kubenswrapper[4789]: I1216 07:27:38.248590 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lwrk" event={"ID":"c7339aad-951b-4f0f-8868-44e1d98f5871","Type":"ContainerDied","Data":"954bb79f22b1319b1e493338dcda0d5ccbe7d62c121435498561572ffda4a6ee"} Dec 16 07:27:39 crc kubenswrapper[4789]: I1216 07:27:39.255976 4789 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-2lwrk" event={"ID":"c7339aad-951b-4f0f-8868-44e1d98f5871","Type":"ContainerStarted","Data":"a20c8ab896fe0a530d7d8c947872e725eb2ea477b5b69646b9cae2f0c2fda14e"} Dec 16 07:27:39 crc kubenswrapper[4789]: I1216 07:27:39.271478 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2lwrk" podStartSLOduration=2.777000464 podStartE2EDuration="7.271458115s" podCreationTimestamp="2025-12-16 07:27:32 +0000 UTC" firstStartedPulling="2025-12-16 07:27:34.230178826 +0000 UTC m=+2192.492066455" lastFinishedPulling="2025-12-16 07:27:38.724636487 +0000 UTC m=+2196.986524106" observedRunningTime="2025-12-16 07:27:39.27000819 +0000 UTC m=+2197.531895829" watchObservedRunningTime="2025-12-16 07:27:39.271458115 +0000 UTC m=+2197.533345764" Dec 16 07:27:42 crc kubenswrapper[4789]: I1216 07:27:42.919375 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:42 crc kubenswrapper[4789]: I1216 07:27:42.920136 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:42 crc kubenswrapper[4789]: I1216 07:27:42.962740 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:43 crc kubenswrapper[4789]: I1216 07:27:43.319775 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2lwrk" Dec 16 07:27:43 crc kubenswrapper[4789]: I1216 07:27:43.389374 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lwrk"] Dec 16 07:27:43 crc kubenswrapper[4789]: I1216 07:27:43.431592 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2jhz"] Dec 16 07:27:43 crc 
kubenswrapper[4789]: I1216 07:27:43.432018 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l2jhz" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="registry-server" containerID="cri-o://08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c" gracePeriod=2 Dec 16 07:27:44 crc kubenswrapper[4789]: I1216 07:27:44.295862 4789 generic.go:334] "Generic (PLEG): container finished" podID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerID="08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c" exitCode=0 Dec 16 07:27:44 crc kubenswrapper[4789]: I1216 07:27:44.295968 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2jhz" event={"ID":"63c09acc-7922-4c06-b5ee-74d87e7e9d80","Type":"ContainerDied","Data":"08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c"} Dec 16 07:27:44 crc kubenswrapper[4789]: E1216 07:27:44.533740 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c is running failed: container process not found" containerID="08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:27:44 crc kubenswrapper[4789]: E1216 07:27:44.534245 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c is running failed: container process not found" containerID="08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:27:44 crc kubenswrapper[4789]: E1216 07:27:44.534571 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of 08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c is running failed: container process not found" containerID="08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 07:27:44 crc kubenswrapper[4789]: E1216 07:27:44.534610 4789 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-l2jhz" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="registry-server" Dec 16 07:27:44 crc kubenswrapper[4789]: I1216 07:27:44.923781 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.069680 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-utilities\") pod \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.069782 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7clzt\" (UniqueName: \"kubernetes.io/projected/63c09acc-7922-4c06-b5ee-74d87e7e9d80-kube-api-access-7clzt\") pod \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\" (UID: \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.069816 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-catalog-content\") pod \"63c09acc-7922-4c06-b5ee-74d87e7e9d80\" (UID: 
\"63c09acc-7922-4c06-b5ee-74d87e7e9d80\") " Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.070521 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-utilities" (OuterVolumeSpecName: "utilities") pod "63c09acc-7922-4c06-b5ee-74d87e7e9d80" (UID: "63c09acc-7922-4c06-b5ee-74d87e7e9d80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.076047 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c09acc-7922-4c06-b5ee-74d87e7e9d80-kube-api-access-7clzt" (OuterVolumeSpecName: "kube-api-access-7clzt") pod "63c09acc-7922-4c06-b5ee-74d87e7e9d80" (UID: "63c09acc-7922-4c06-b5ee-74d87e7e9d80"). InnerVolumeSpecName "kube-api-access-7clzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.116126 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63c09acc-7922-4c06-b5ee-74d87e7e9d80" (UID: "63c09acc-7922-4c06-b5ee-74d87e7e9d80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.171140 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.171174 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7clzt\" (UniqueName: \"kubernetes.io/projected/63c09acc-7922-4c06-b5ee-74d87e7e9d80-kube-api-access-7clzt\") on node \"crc\" DevicePath \"\"" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.171189 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63c09acc-7922-4c06-b5ee-74d87e7e9d80-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.305909 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2jhz" event={"ID":"63c09acc-7922-4c06-b5ee-74d87e7e9d80","Type":"ContainerDied","Data":"db487e877d244460fe3474349185646a0653dcf460d80bac31e4b193dd7b29bf"} Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.306049 4789 scope.go:117] "RemoveContainer" containerID="08ec148d351b6b03a60591ff8ef5f87eb51a91a2697ac0d543e609451247536c" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.305981 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2jhz" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.337666 4789 scope.go:117] "RemoveContainer" containerID="516bf647bdef8ec0cb28524c73ee2f42321aced10951d7b193b997b0769973a9" Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.343132 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2jhz"] Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.347762 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2jhz"] Dec 16 07:27:45 crc kubenswrapper[4789]: I1216 07:27:45.357149 4789 scope.go:117] "RemoveContainer" containerID="9a07630bd2effed51a4bb91caa8ca12366a39692f665d9c41c191227e4e27799" Dec 16 07:27:46 crc kubenswrapper[4789]: I1216 07:27:46.121012 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" path="/var/lib/kubelet/pods/63c09acc-7922-4c06-b5ee-74d87e7e9d80/volumes" Dec 16 07:27:51 crc kubenswrapper[4789]: I1216 07:27:51.927530 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:27:51 crc kubenswrapper[4789]: I1216 07:27:51.927795 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:27:51 crc kubenswrapper[4789]: I1216 07:27:51.927837 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 
07:27:51 crc kubenswrapper[4789]: I1216 07:27:51.928385 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9bab3fc5a6efc59a48aae9a5179bf05df99f772dee59fd9fcec844e562d2c74"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:27:51 crc kubenswrapper[4789]: I1216 07:27:51.928429 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://a9bab3fc5a6efc59a48aae9a5179bf05df99f772dee59fd9fcec844e562d2c74" gracePeriod=600 Dec 16 07:27:52 crc kubenswrapper[4789]: I1216 07:27:52.362796 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="a9bab3fc5a6efc59a48aae9a5179bf05df99f772dee59fd9fcec844e562d2c74" exitCode=0 Dec 16 07:27:52 crc kubenswrapper[4789]: I1216 07:27:52.362878 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"a9bab3fc5a6efc59a48aae9a5179bf05df99f772dee59fd9fcec844e562d2c74"} Dec 16 07:27:52 crc kubenswrapper[4789]: I1216 07:27:52.363548 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"} Dec 16 07:27:52 crc kubenswrapper[4789]: I1216 07:27:52.363712 4789 scope.go:117] "RemoveContainer" containerID="5daed310edddf31ee0069f8e44599472f863fe81797433aaed0c81d99e6df8c6" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.387993 4789 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cxmwx"] Dec 16 07:27:56 crc kubenswrapper[4789]: E1216 07:27:56.389576 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="registry-server" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.389596 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="registry-server" Dec 16 07:27:56 crc kubenswrapper[4789]: E1216 07:27:56.389608 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="extract-utilities" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.389615 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="extract-utilities" Dec 16 07:27:56 crc kubenswrapper[4789]: E1216 07:27:56.389627 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="extract-content" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.389633 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="extract-content" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.389793 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c09acc-7922-4c06-b5ee-74d87e7e9d80" containerName="registry-server" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.391002 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.410341 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxmwx"] Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.447272 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-catalog-content\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.447315 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flsv4\" (UniqueName: \"kubernetes.io/projected/4d1e8b75-2021-485e-a26b-4a45286fb421-kube-api-access-flsv4\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.447395 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-utilities\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.549103 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-catalog-content\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.549225 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-flsv4\" (UniqueName: \"kubernetes.io/projected/4d1e8b75-2021-485e-a26b-4a45286fb421-kube-api-access-flsv4\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.549419 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-utilities\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.549663 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-catalog-content\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.549804 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-utilities\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.568569 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flsv4\" (UniqueName: \"kubernetes.io/projected/4d1e8b75-2021-485e-a26b-4a45286fb421-kube-api-access-flsv4\") pod \"redhat-marketplace-cxmwx\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:56 crc kubenswrapper[4789]: I1216 07:27:56.745788 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:27:57 crc kubenswrapper[4789]: I1216 07:27:57.184466 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxmwx"] Dec 16 07:27:57 crc kubenswrapper[4789]: I1216 07:27:57.410852 4789 generic.go:334] "Generic (PLEG): container finished" podID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerID="ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318" exitCode=0 Dec 16 07:27:57 crc kubenswrapper[4789]: I1216 07:27:57.410926 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxmwx" event={"ID":"4d1e8b75-2021-485e-a26b-4a45286fb421","Type":"ContainerDied","Data":"ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318"} Dec 16 07:27:57 crc kubenswrapper[4789]: I1216 07:27:57.411186 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxmwx" event={"ID":"4d1e8b75-2021-485e-a26b-4a45286fb421","Type":"ContainerStarted","Data":"0a6744628d257b92b75be27fe2cd9abfa28eaf4a2148ef0e2f30dae990675e6b"} Dec 16 07:27:58 crc kubenswrapper[4789]: I1216 07:27:58.419733 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxmwx" event={"ID":"4d1e8b75-2021-485e-a26b-4a45286fb421","Type":"ContainerStarted","Data":"9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe"} Dec 16 07:27:59 crc kubenswrapper[4789]: I1216 07:27:59.430035 4789 generic.go:334] "Generic (PLEG): container finished" podID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerID="9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe" exitCode=0 Dec 16 07:27:59 crc kubenswrapper[4789]: I1216 07:27:59.430100 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxmwx" 
event={"ID":"4d1e8b75-2021-485e-a26b-4a45286fb421","Type":"ContainerDied","Data":"9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe"} Dec 16 07:28:00 crc kubenswrapper[4789]: I1216 07:28:00.439516 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxmwx" event={"ID":"4d1e8b75-2021-485e-a26b-4a45286fb421","Type":"ContainerStarted","Data":"41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640"} Dec 16 07:28:00 crc kubenswrapper[4789]: I1216 07:28:00.465039 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cxmwx" podStartSLOduration=2.013161076 podStartE2EDuration="4.465015256s" podCreationTimestamp="2025-12-16 07:27:56 +0000 UTC" firstStartedPulling="2025-12-16 07:27:57.414644506 +0000 UTC m=+2215.676532135" lastFinishedPulling="2025-12-16 07:27:59.866498686 +0000 UTC m=+2218.128386315" observedRunningTime="2025-12-16 07:28:00.459256895 +0000 UTC m=+2218.721144524" watchObservedRunningTime="2025-12-16 07:28:00.465015256 +0000 UTC m=+2218.726902895" Dec 16 07:28:06 crc kubenswrapper[4789]: I1216 07:28:06.746513 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:28:06 crc kubenswrapper[4789]: I1216 07:28:06.747221 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:28:06 crc kubenswrapper[4789]: I1216 07:28:06.785888 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:28:07 crc kubenswrapper[4789]: I1216 07:28:07.533475 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:28:07 crc kubenswrapper[4789]: I1216 07:28:07.588597 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cxmwx"] Dec 16 07:28:09 crc kubenswrapper[4789]: I1216 07:28:09.505739 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cxmwx" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="registry-server" containerID="cri-o://41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640" gracePeriod=2 Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.421302 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.441141 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-utilities\") pod \"4d1e8b75-2021-485e-a26b-4a45286fb421\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.441287 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flsv4\" (UniqueName: \"kubernetes.io/projected/4d1e8b75-2021-485e-a26b-4a45286fb421-kube-api-access-flsv4\") pod \"4d1e8b75-2021-485e-a26b-4a45286fb421\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.441314 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-catalog-content\") pod \"4d1e8b75-2021-485e-a26b-4a45286fb421\" (UID: \"4d1e8b75-2021-485e-a26b-4a45286fb421\") " Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.442592 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-utilities" (OuterVolumeSpecName: "utilities") pod "4d1e8b75-2021-485e-a26b-4a45286fb421" (UID: 
"4d1e8b75-2021-485e-a26b-4a45286fb421"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.447160 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1e8b75-2021-485e-a26b-4a45286fb421-kube-api-access-flsv4" (OuterVolumeSpecName: "kube-api-access-flsv4") pod "4d1e8b75-2021-485e-a26b-4a45286fb421" (UID: "4d1e8b75-2021-485e-a26b-4a45286fb421"). InnerVolumeSpecName "kube-api-access-flsv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.485243 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d1e8b75-2021-485e-a26b-4a45286fb421" (UID: "4d1e8b75-2021-485e-a26b-4a45286fb421"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.514128 4789 generic.go:334] "Generic (PLEG): container finished" podID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerID="41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640" exitCode=0 Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.514985 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxmwx" event={"ID":"4d1e8b75-2021-485e-a26b-4a45286fb421","Type":"ContainerDied","Data":"41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640"} Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.515368 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxmwx" event={"ID":"4d1e8b75-2021-485e-a26b-4a45286fb421","Type":"ContainerDied","Data":"0a6744628d257b92b75be27fe2cd9abfa28eaf4a2148ef0e2f30dae990675e6b"} Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.515468 
4789 scope.go:117] "RemoveContainer" containerID="41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.515065 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxmwx" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.538507 4789 scope.go:117] "RemoveContainer" containerID="9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.542455 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flsv4\" (UniqueName: \"kubernetes.io/projected/4d1e8b75-2021-485e-a26b-4a45286fb421-kube-api-access-flsv4\") on node \"crc\" DevicePath \"\"" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.542477 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.542486 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1e8b75-2021-485e-a26b-4a45286fb421-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.563577 4789 scope.go:117] "RemoveContainer" containerID="ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318" Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.579027 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxmwx"] Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.580851 4789 scope.go:117] "RemoveContainer" containerID="41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640" Dec 16 07:28:10 crc kubenswrapper[4789]: E1216 07:28:10.581922 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640\": container with ID starting with 41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640 not found: ID does not exist" containerID="41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640"
Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.582056 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640"} err="failed to get container status \"41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640\": rpc error: code = NotFound desc = could not find container \"41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640\": container with ID starting with 41638ab7fed66e2366eb62bf72ca6de3756bc691b1cc92308888291fad841640 not found: ID does not exist"
Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.582165 4789 scope.go:117] "RemoveContainer" containerID="9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe"
Dec 16 07:28:10 crc kubenswrapper[4789]: E1216 07:28:10.582639 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe\": container with ID starting with 9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe not found: ID does not exist" containerID="9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe"
Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.582673 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe"} err="failed to get container status \"9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe\": rpc error: code = NotFound desc = could not find container \"9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe\": container with ID starting with 9c3de429a649d8dc72e84378991b102baf0b6b3657e68435a96c41a48f0b2ebe not found: ID does not exist"
Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.582694 4789 scope.go:117] "RemoveContainer" containerID="ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318"
Dec 16 07:28:10 crc kubenswrapper[4789]: E1216 07:28:10.582894 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318\": container with ID starting with ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318 not found: ID does not exist" containerID="ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318"
Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.582933 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318"} err="failed to get container status \"ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318\": rpc error: code = NotFound desc = could not find container \"ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318\": container with ID starting with ba74ed37653f5304315528bc71d8983f1f7cae954bce3de2ac8a3bb6c4f48318 not found: ID does not exist"
Dec 16 07:28:10 crc kubenswrapper[4789]: I1216 07:28:10.585975 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxmwx"]
Dec 16 07:28:12 crc kubenswrapper[4789]: I1216 07:28:12.113252 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" path="/var/lib/kubelet/pods/4d1e8b75-2021-485e-a26b-4a45286fb421/volumes"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.174444 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"]
Dec 16 07:30:00 crc kubenswrapper[4789]: E1216 07:30:00.175590 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="extract-content"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.175614 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="extract-content"
Dec 16 07:30:00 crc kubenswrapper[4789]: E1216 07:30:00.175638 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="extract-utilities"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.175652 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="extract-utilities"
Dec 16 07:30:00 crc kubenswrapper[4789]: E1216 07:30:00.175675 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="registry-server"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.175687 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="registry-server"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.175935 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1e8b75-2021-485e-a26b-4a45286fb421" containerName="registry-server"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.176623 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.180218 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.183003 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.185339 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"]
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.326174 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6xf\" (UniqueName: \"kubernetes.io/projected/3e1079d4-ef8d-4d61-a1da-1616004f8c66-kube-api-access-tx6xf\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.326510 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e1079d4-ef8d-4d61-a1da-1616004f8c66-config-volume\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.326609 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e1079d4-ef8d-4d61-a1da-1616004f8c66-secret-volume\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.427778 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e1079d4-ef8d-4d61-a1da-1616004f8c66-config-volume\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.427841 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e1079d4-ef8d-4d61-a1da-1616004f8c66-secret-volume\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.427891 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx6xf\" (UniqueName: \"kubernetes.io/projected/3e1079d4-ef8d-4d61-a1da-1616004f8c66-kube-api-access-tx6xf\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.428699 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e1079d4-ef8d-4d61-a1da-1616004f8c66-config-volume\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.433994 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e1079d4-ef8d-4d61-a1da-1616004f8c66-secret-volume\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.444604 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6xf\" (UniqueName: \"kubernetes.io/projected/3e1079d4-ef8d-4d61-a1da-1616004f8c66-kube-api-access-tx6xf\") pod \"collect-profiles-29431170-kshmp\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.496965 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:00 crc kubenswrapper[4789]: I1216 07:30:00.941393 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"]
Dec 16 07:30:01 crc kubenswrapper[4789]: I1216 07:30:01.273041 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp" event={"ID":"3e1079d4-ef8d-4d61-a1da-1616004f8c66","Type":"ContainerStarted","Data":"ed517f2381834c54c99a7e402bac8b0d9fb902b884854bb93612702a90eb4b4c"}
Dec 16 07:30:01 crc kubenswrapper[4789]: I1216 07:30:01.273927 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp" event={"ID":"3e1079d4-ef8d-4d61-a1da-1616004f8c66","Type":"ContainerStarted","Data":"061ae3fbcf886d44bef0aecdadfaf05eb5757ae9b33e65cfeeb0f0bb01834939"}
Dec 16 07:30:01 crc kubenswrapper[4789]: I1216 07:30:01.288397 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp" podStartSLOduration=1.28837678 podStartE2EDuration="1.28837678s" podCreationTimestamp="2025-12-16 07:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 07:30:01.286432123 +0000 UTC m=+2339.548319782" watchObservedRunningTime="2025-12-16 07:30:01.28837678 +0000 UTC m=+2339.550264409"
Dec 16 07:30:02 crc kubenswrapper[4789]: I1216 07:30:02.280060 4789 generic.go:334] "Generic (PLEG): container finished" podID="3e1079d4-ef8d-4d61-a1da-1616004f8c66" containerID="ed517f2381834c54c99a7e402bac8b0d9fb902b884854bb93612702a90eb4b4c" exitCode=0
Dec 16 07:30:02 crc kubenswrapper[4789]: I1216 07:30:02.280224 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp" event={"ID":"3e1079d4-ef8d-4d61-a1da-1616004f8c66","Type":"ContainerDied","Data":"ed517f2381834c54c99a7e402bac8b0d9fb902b884854bb93612702a90eb4b4c"}
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.528460 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.668578 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e1079d4-ef8d-4d61-a1da-1616004f8c66-secret-volume\") pod \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") "
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.668704 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx6xf\" (UniqueName: \"kubernetes.io/projected/3e1079d4-ef8d-4d61-a1da-1616004f8c66-kube-api-access-tx6xf\") pod \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") "
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.668826 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e1079d4-ef8d-4d61-a1da-1616004f8c66-config-volume\") pod \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\" (UID: \"3e1079d4-ef8d-4d61-a1da-1616004f8c66\") "
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.669671 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1079d4-ef8d-4d61-a1da-1616004f8c66-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e1079d4-ef8d-4d61-a1da-1616004f8c66" (UID: "3e1079d4-ef8d-4d61-a1da-1616004f8c66"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.674442 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1079d4-ef8d-4d61-a1da-1616004f8c66-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e1079d4-ef8d-4d61-a1da-1616004f8c66" (UID: "3e1079d4-ef8d-4d61-a1da-1616004f8c66"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.674482 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1079d4-ef8d-4d61-a1da-1616004f8c66-kube-api-access-tx6xf" (OuterVolumeSpecName: "kube-api-access-tx6xf") pod "3e1079d4-ef8d-4d61-a1da-1616004f8c66" (UID: "3e1079d4-ef8d-4d61-a1da-1616004f8c66"). InnerVolumeSpecName "kube-api-access-tx6xf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.770418 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e1079d4-ef8d-4d61-a1da-1616004f8c66-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.770457 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx6xf\" (UniqueName: \"kubernetes.io/projected/3e1079d4-ef8d-4d61-a1da-1616004f8c66-kube-api-access-tx6xf\") on node \"crc\" DevicePath \"\""
Dec 16 07:30:03 crc kubenswrapper[4789]: I1216 07:30:03.770470 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e1079d4-ef8d-4d61-a1da-1616004f8c66-config-volume\") on node \"crc\" DevicePath \"\""
Dec 16 07:30:04 crc kubenswrapper[4789]: I1216 07:30:04.294092 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp" event={"ID":"3e1079d4-ef8d-4d61-a1da-1616004f8c66","Type":"ContainerDied","Data":"061ae3fbcf886d44bef0aecdadfaf05eb5757ae9b33e65cfeeb0f0bb01834939"}
Dec 16 07:30:04 crc kubenswrapper[4789]: I1216 07:30:04.294407 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="061ae3fbcf886d44bef0aecdadfaf05eb5757ae9b33e65cfeeb0f0bb01834939"
Dec 16 07:30:04 crc kubenswrapper[4789]: I1216 07:30:04.294144 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"
Dec 16 07:30:04 crc kubenswrapper[4789]: I1216 07:30:04.355309 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv"]
Dec 16 07:30:04 crc kubenswrapper[4789]: I1216 07:30:04.360435 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431125-8wqnv"]
Dec 16 07:30:06 crc kubenswrapper[4789]: I1216 07:30:06.116241 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6e8c5a-b937-4072-902d-28e056de16d2" path="/var/lib/kubelet/pods/4e6e8c5a-b937-4072-902d-28e056de16d2/volumes"
Dec 16 07:30:21 crc kubenswrapper[4789]: I1216 07:30:21.927624 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:30:21 crc kubenswrapper[4789]: I1216 07:30:21.928234 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:30:27 crc kubenswrapper[4789]: I1216 07:30:27.939067 4789 scope.go:117] "RemoveContainer" containerID="c8e96bffcf0ba6f28aabbf35de857a380e1bf23c8d8f14b3997302cf29b1bf22"
Dec 16 07:30:51 crc kubenswrapper[4789]: I1216 07:30:51.928471 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:30:51 crc kubenswrapper[4789]: I1216 07:30:51.929179 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:31:21 crc kubenswrapper[4789]: I1216 07:31:21.927970 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 07:31:21 crc kubenswrapper[4789]: I1216 07:31:21.928453 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 07:31:21 crc kubenswrapper[4789]: I1216 07:31:21.928493 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87"
Dec 16 07:31:21 crc kubenswrapper[4789]: I1216 07:31:21.928897 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 07:31:21 crc kubenswrapper[4789]: I1216 07:31:21.928966 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" gracePeriod=600
Dec 16 07:31:22 crc kubenswrapper[4789]: E1216 07:31:22.053333 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:31:22 crc kubenswrapper[4789]: I1216 07:31:22.855801 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" exitCode=0
Dec 16 07:31:22 crc kubenswrapper[4789]: I1216 07:31:22.855853 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"}
Dec 16 07:31:22 crc kubenswrapper[4789]: I1216 07:31:22.855891 4789 scope.go:117] "RemoveContainer" containerID="a9bab3fc5a6efc59a48aae9a5179bf05df99f772dee59fd9fcec844e562d2c74"
Dec 16 07:31:22 crc kubenswrapper[4789]: I1216 07:31:22.856466 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:31:22 crc kubenswrapper[4789]: E1216 07:31:22.856845 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:31:36 crc kubenswrapper[4789]: I1216 07:31:36.105224 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:31:36 crc kubenswrapper[4789]: E1216 07:31:36.105891 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:31:51 crc kubenswrapper[4789]: I1216 07:31:51.105267 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:31:51 crc kubenswrapper[4789]: E1216 07:31:51.106393 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.247553 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zjvq8"]
Dec 16 07:32:03 crc kubenswrapper[4789]: E1216 07:32:03.248323 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1079d4-ef8d-4d61-a1da-1616004f8c66" containerName="collect-profiles"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.248336 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1079d4-ef8d-4d61-a1da-1616004f8c66" containerName="collect-profiles"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.248478 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1079d4-ef8d-4d61-a1da-1616004f8c66" containerName="collect-profiles"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.249430 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.253361 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjvq8"]
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.421482 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfrl\" (UniqueName: \"kubernetes.io/projected/642f3855-99de-4f55-b989-9c8d79599439-kube-api-access-vbfrl\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.421573 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-catalog-content\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.421674 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-utilities\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.522729 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbfrl\" (UniqueName: \"kubernetes.io/projected/642f3855-99de-4f55-b989-9c8d79599439-kube-api-access-vbfrl\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.522816 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-catalog-content\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.522848 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-utilities\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.523290 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-utilities\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.523601 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-catalog-content\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.543835 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbfrl\" (UniqueName: \"kubernetes.io/projected/642f3855-99de-4f55-b989-9c8d79599439-kube-api-access-vbfrl\") pod \"redhat-operators-zjvq8\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.572636 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:03 crc kubenswrapper[4789]: I1216 07:32:03.994413 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjvq8"]
Dec 16 07:32:04 crc kubenswrapper[4789]: I1216 07:32:04.104584 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:32:04 crc kubenswrapper[4789]: E1216 07:32:04.105142 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:32:04 crc kubenswrapper[4789]: I1216 07:32:04.152446 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerStarted","Data":"7c9e4e3fb5af5c5c905a79c7393ce8aa41313d7cb15af3a4598ef6b440eeca3d"}
Dec 16 07:32:04 crc kubenswrapper[4789]: I1216 07:32:04.152490 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerStarted","Data":"02eaa5292091012e7dadf8e24319186cc29c5f7436900d6d98db6ed6b9892dde"}
Dec 16 07:32:05 crc kubenswrapper[4789]: I1216 07:32:05.163163 4789 generic.go:334] "Generic (PLEG): container finished" podID="642f3855-99de-4f55-b989-9c8d79599439" containerID="7c9e4e3fb5af5c5c905a79c7393ce8aa41313d7cb15af3a4598ef6b440eeca3d" exitCode=0
Dec 16 07:32:05 crc kubenswrapper[4789]: I1216 07:32:05.163295 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerDied","Data":"7c9e4e3fb5af5c5c905a79c7393ce8aa41313d7cb15af3a4598ef6b440eeca3d"}
Dec 16 07:32:06 crc kubenswrapper[4789]: I1216 07:32:06.171293 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerStarted","Data":"e7d3e8e5554b9ea7814d18de325b2a18596b28e847269fc636d0b80d8aacc521"}
Dec 16 07:32:07 crc kubenswrapper[4789]: I1216 07:32:07.179993 4789 generic.go:334] "Generic (PLEG): container finished" podID="642f3855-99de-4f55-b989-9c8d79599439" containerID="e7d3e8e5554b9ea7814d18de325b2a18596b28e847269fc636d0b80d8aacc521" exitCode=0
Dec 16 07:32:07 crc kubenswrapper[4789]: I1216 07:32:07.180091 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerDied","Data":"e7d3e8e5554b9ea7814d18de325b2a18596b28e847269fc636d0b80d8aacc521"}
Dec 16 07:32:08 crc kubenswrapper[4789]: I1216 07:32:08.191218 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerStarted","Data":"da6db7c092f387430ec3cef1a900c2b8c1b9d01fa2121f18af8c2f94b7e401dd"}
Dec 16 07:32:08 crc kubenswrapper[4789]: I1216 07:32:08.212001 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjvq8" podStartSLOduration=2.794709204 podStartE2EDuration="5.211980425s" podCreationTimestamp="2025-12-16 07:32:03 +0000 UTC" firstStartedPulling="2025-12-16 07:32:05.167017835 +0000 UTC m=+2463.428905474" lastFinishedPulling="2025-12-16 07:32:07.584289066 +0000 UTC m=+2465.846176695" observedRunningTime="2025-12-16 07:32:08.206564733 +0000 UTC m=+2466.468452372" watchObservedRunningTime="2025-12-16 07:32:08.211980425 +0000 UTC m=+2466.473868074"
Dec 16 07:32:12 crc kubenswrapper[4789]: I1216 07:32:12.969409 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j9sk4"]
Dec 16 07:32:12 crc kubenswrapper[4789]: I1216 07:32:12.978065 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.002512 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9sk4"]
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.066265 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-catalog-content\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.066362 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cdv2\" (UniqueName: \"kubernetes.io/projected/341e149a-ee61-44e8-8645-c5fc7452430d-kube-api-access-9cdv2\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.066384 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-utilities\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.167096 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdv2\" (UniqueName: \"kubernetes.io/projected/341e149a-ee61-44e8-8645-c5fc7452430d-kube-api-access-9cdv2\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.167142 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-utilities\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.167210 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-catalog-content\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.167806 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-catalog-content\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.167817 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-utilities\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.191161 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cdv2\" (UniqueName: \"kubernetes.io/projected/341e149a-ee61-44e8-8645-c5fc7452430d-kube-api-access-9cdv2\") pod \"certified-operators-j9sk4\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.307589 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9sk4"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.573589 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.573865 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.634267 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:13 crc kubenswrapper[4789]: I1216 07:32:13.793820 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9sk4"]
Dec 16 07:32:14 crc kubenswrapper[4789]: I1216 07:32:14.227326 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9sk4" event={"ID":"341e149a-ee61-44e8-8645-c5fc7452430d","Type":"ContainerStarted","Data":"b2d5cc2aaea1689c7d186f59ee7d2dbce8815c7f4f6b3b12d756106ea1e5e879"}
Dec 16 07:32:14 crc kubenswrapper[4789]: I1216 07:32:14.266159 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjvq8"
Dec 16 07:32:14 crc kubenswrapper[4789]: I1216 07:32:14.762752 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjvq8"]
Dec 16 07:32:15 crc kubenswrapper[4789]: I1216 07:32:15.236634 4789 generic.go:334] "Generic (PLEG): container finished" podID="341e149a-ee61-44e8-8645-c5fc7452430d" containerID="51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210" exitCode=0
Dec 16 07:32:15 crc kubenswrapper[4789]: I1216 07:32:15.236731 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9sk4" event={"ID":"341e149a-ee61-44e8-8645-c5fc7452430d","Type":"ContainerDied","Data":"51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210"}
Dec 16 07:32:16 crc kubenswrapper[4789]: I1216 07:32:16.245719 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9sk4" event={"ID":"341e149a-ee61-44e8-8645-c5fc7452430d","Type":"ContainerStarted","Data":"97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029"}
Dec 16 07:32:16 crc kubenswrapper[4789]: I1216 07:32:16.245836 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zjvq8" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="registry-server" containerID="cri-o://da6db7c092f387430ec3cef1a900c2b8c1b9d01fa2121f18af8c2f94b7e401dd" gracePeriod=2
Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.104887 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:32:18 crc kubenswrapper[4789]: E1216 07:32:18.105420 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.260674 4789 generic.go:334] "Generic (PLEG): container finished" podID="642f3855-99de-4f55-b989-9c8d79599439" containerID="da6db7c092f387430ec3cef1a900c2b8c1b9d01fa2121f18af8c2f94b7e401dd" exitCode=0 Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.260746 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerDied","Data":"da6db7c092f387430ec3cef1a900c2b8c1b9d01fa2121f18af8c2f94b7e401dd"} Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.263107 4789 generic.go:334] "Generic (PLEG): container finished" podID="341e149a-ee61-44e8-8645-c5fc7452430d" containerID="97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029" exitCode=0 Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.263136 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9sk4" event={"ID":"341e149a-ee61-44e8-8645-c5fc7452430d","Type":"ContainerDied","Data":"97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029"} Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.826071 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjvq8" Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.945422 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-catalog-content\") pod \"642f3855-99de-4f55-b989-9c8d79599439\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.945511 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-utilities\") pod \"642f3855-99de-4f55-b989-9c8d79599439\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.945585 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbfrl\" (UniqueName: \"kubernetes.io/projected/642f3855-99de-4f55-b989-9c8d79599439-kube-api-access-vbfrl\") pod \"642f3855-99de-4f55-b989-9c8d79599439\" (UID: \"642f3855-99de-4f55-b989-9c8d79599439\") " Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.946681 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-utilities" (OuterVolumeSpecName: "utilities") pod "642f3855-99de-4f55-b989-9c8d79599439" (UID: "642f3855-99de-4f55-b989-9c8d79599439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:32:18 crc kubenswrapper[4789]: I1216 07:32:18.957168 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642f3855-99de-4f55-b989-9c8d79599439-kube-api-access-vbfrl" (OuterVolumeSpecName: "kube-api-access-vbfrl") pod "642f3855-99de-4f55-b989-9c8d79599439" (UID: "642f3855-99de-4f55-b989-9c8d79599439"). InnerVolumeSpecName "kube-api-access-vbfrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.047574 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.047617 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbfrl\" (UniqueName: \"kubernetes.io/projected/642f3855-99de-4f55-b989-9c8d79599439-kube-api-access-vbfrl\") on node \"crc\" DevicePath \"\"" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.053573 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "642f3855-99de-4f55-b989-9c8d79599439" (UID: "642f3855-99de-4f55-b989-9c8d79599439"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.149287 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/642f3855-99de-4f55-b989-9c8d79599439-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.271229 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9sk4" event={"ID":"341e149a-ee61-44e8-8645-c5fc7452430d","Type":"ContainerStarted","Data":"84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60"} Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.275154 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjvq8" event={"ID":"642f3855-99de-4f55-b989-9c8d79599439","Type":"ContainerDied","Data":"02eaa5292091012e7dadf8e24319186cc29c5f7436900d6d98db6ed6b9892dde"} Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 
07:32:19.275216 4789 scope.go:117] "RemoveContainer" containerID="da6db7c092f387430ec3cef1a900c2b8c1b9d01fa2121f18af8c2f94b7e401dd" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.275223 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjvq8" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.293975 4789 scope.go:117] "RemoveContainer" containerID="e7d3e8e5554b9ea7814d18de325b2a18596b28e847269fc636d0b80d8aacc521" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.296838 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j9sk4" podStartSLOduration=3.867129853 podStartE2EDuration="7.296823553s" podCreationTimestamp="2025-12-16 07:32:12 +0000 UTC" firstStartedPulling="2025-12-16 07:32:15.238541554 +0000 UTC m=+2473.500429173" lastFinishedPulling="2025-12-16 07:32:18.668235244 +0000 UTC m=+2476.930122873" observedRunningTime="2025-12-16 07:32:19.291509614 +0000 UTC m=+2477.553397253" watchObservedRunningTime="2025-12-16 07:32:19.296823553 +0000 UTC m=+2477.558711182" Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.312089 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjvq8"] Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.316017 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zjvq8"] Dec 16 07:32:19 crc kubenswrapper[4789]: I1216 07:32:19.330211 4789 scope.go:117] "RemoveContainer" containerID="7c9e4e3fb5af5c5c905a79c7393ce8aa41313d7cb15af3a4598ef6b440eeca3d" Dec 16 07:32:20 crc kubenswrapper[4789]: I1216 07:32:20.122094 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="642f3855-99de-4f55-b989-9c8d79599439" path="/var/lib/kubelet/pods/642f3855-99de-4f55-b989-9c8d79599439/volumes" Dec 16 07:32:23 crc kubenswrapper[4789]: I1216 07:32:23.308098 4789 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j9sk4" Dec 16 07:32:23 crc kubenswrapper[4789]: I1216 07:32:23.308453 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j9sk4" Dec 16 07:32:23 crc kubenswrapper[4789]: I1216 07:32:23.346025 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j9sk4" Dec 16 07:32:24 crc kubenswrapper[4789]: I1216 07:32:24.356703 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j9sk4" Dec 16 07:32:24 crc kubenswrapper[4789]: I1216 07:32:24.772561 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9sk4"] Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.320535 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j9sk4" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="registry-server" containerID="cri-o://84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60" gracePeriod=2 Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.683522 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9sk4" Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.781815 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-utilities\") pod \"341e149a-ee61-44e8-8645-c5fc7452430d\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.781895 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-catalog-content\") pod \"341e149a-ee61-44e8-8645-c5fc7452430d\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.781955 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cdv2\" (UniqueName: \"kubernetes.io/projected/341e149a-ee61-44e8-8645-c5fc7452430d-kube-api-access-9cdv2\") pod \"341e149a-ee61-44e8-8645-c5fc7452430d\" (UID: \"341e149a-ee61-44e8-8645-c5fc7452430d\") " Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.783237 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-utilities" (OuterVolumeSpecName: "utilities") pod "341e149a-ee61-44e8-8645-c5fc7452430d" (UID: "341e149a-ee61-44e8-8645-c5fc7452430d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.787947 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/341e149a-ee61-44e8-8645-c5fc7452430d-kube-api-access-9cdv2" (OuterVolumeSpecName: "kube-api-access-9cdv2") pod "341e149a-ee61-44e8-8645-c5fc7452430d" (UID: "341e149a-ee61-44e8-8645-c5fc7452430d"). InnerVolumeSpecName "kube-api-access-9cdv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.883146 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.883178 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cdv2\" (UniqueName: \"kubernetes.io/projected/341e149a-ee61-44e8-8645-c5fc7452430d-kube-api-access-9cdv2\") on node \"crc\" DevicePath \"\"" Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.927624 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "341e149a-ee61-44e8-8645-c5fc7452430d" (UID: "341e149a-ee61-44e8-8645-c5fc7452430d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:32:26 crc kubenswrapper[4789]: I1216 07:32:26.984248 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341e149a-ee61-44e8-8645-c5fc7452430d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.329730 4789 generic.go:334] "Generic (PLEG): container finished" podID="341e149a-ee61-44e8-8645-c5fc7452430d" containerID="84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60" exitCode=0 Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.329789 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9sk4" event={"ID":"341e149a-ee61-44e8-8645-c5fc7452430d","Type":"ContainerDied","Data":"84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60"} Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.329821 4789 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9sk4" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.329844 4789 scope.go:117] "RemoveContainer" containerID="84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.329829 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9sk4" event={"ID":"341e149a-ee61-44e8-8645-c5fc7452430d","Type":"ContainerDied","Data":"b2d5cc2aaea1689c7d186f59ee7d2dbce8815c7f4f6b3b12d756106ea1e5e879"} Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.351779 4789 scope.go:117] "RemoveContainer" containerID="97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.380360 4789 scope.go:117] "RemoveContainer" containerID="51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.380454 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9sk4"] Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.389501 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j9sk4"] Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.417383 4789 scope.go:117] "RemoveContainer" containerID="84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60" Dec 16 07:32:27 crc kubenswrapper[4789]: E1216 07:32:27.417811 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60\": container with ID starting with 84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60 not found: ID does not exist" containerID="84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.417849 
4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60"} err="failed to get container status \"84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60\": rpc error: code = NotFound desc = could not find container \"84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60\": container with ID starting with 84a0f51ceaf18cb0cf1ba9baf75835030c1bdc355fb6c40751f31af2dcec2d60 not found: ID does not exist" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.417874 4789 scope.go:117] "RemoveContainer" containerID="97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029" Dec 16 07:32:27 crc kubenswrapper[4789]: E1216 07:32:27.418229 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029\": container with ID starting with 97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029 not found: ID does not exist" containerID="97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.418257 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029"} err="failed to get container status \"97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029\": rpc error: code = NotFound desc = could not find container \"97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029\": container with ID starting with 97f2050b3b97a379b2bf377054c7f17343759730fc7f726d19da8c3a6ca1b029 not found: ID does not exist" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.418275 4789 scope.go:117] "RemoveContainer" containerID="51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210" Dec 16 07:32:27 crc kubenswrapper[4789]: E1216 
07:32:27.418697 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210\": container with ID starting with 51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210 not found: ID does not exist" containerID="51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210" Dec 16 07:32:27 crc kubenswrapper[4789]: I1216 07:32:27.418759 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210"} err="failed to get container status \"51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210\": rpc error: code = NotFound desc = could not find container \"51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210\": container with ID starting with 51c17a86fbc701ed64a89208d7a8f43c2741313604410355aedbed2ece686210 not found: ID does not exist" Dec 16 07:32:28 crc kubenswrapper[4789]: I1216 07:32:28.115134 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" path="/var/lib/kubelet/pods/341e149a-ee61-44e8-8645-c5fc7452430d/volumes" Dec 16 07:32:29 crc kubenswrapper[4789]: I1216 07:32:29.105740 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:32:29 crc kubenswrapper[4789]: E1216 07:32:29.108290 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:32:40 crc kubenswrapper[4789]: I1216 07:32:40.105091 
4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:32:40 crc kubenswrapper[4789]: E1216 07:32:40.105657 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:32:51 crc kubenswrapper[4789]: I1216 07:32:51.105162 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:32:51 crc kubenswrapper[4789]: E1216 07:32:51.105851 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:33:04 crc kubenswrapper[4789]: I1216 07:33:04.105902 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:33:04 crc kubenswrapper[4789]: E1216 07:33:04.107218 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:33:16 crc kubenswrapper[4789]: I1216 
07:33:16.104770 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:33:16 crc kubenswrapper[4789]: E1216 07:33:16.105540 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:33:31 crc kubenswrapper[4789]: I1216 07:33:31.106294 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:33:31 crc kubenswrapper[4789]: E1216 07:33:31.107101 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:33:44 crc kubenswrapper[4789]: I1216 07:33:44.105484 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:33:44 crc kubenswrapper[4789]: E1216 07:33:44.107141 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:33:55 crc 
kubenswrapper[4789]: I1216 07:33:55.106530 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:33:55 crc kubenswrapper[4789]: E1216 07:33:55.107656 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:34:08 crc kubenswrapper[4789]: I1216 07:34:08.104732 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:34:08 crc kubenswrapper[4789]: E1216 07:34:08.105860 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:34:23 crc kubenswrapper[4789]: I1216 07:34:23.105643 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:34:23 crc kubenswrapper[4789]: E1216 07:34:23.106475 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 
16 07:34:37 crc kubenswrapper[4789]: I1216 07:34:37.105486 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:34:37 crc kubenswrapper[4789]: E1216 07:34:37.106269 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:34:49 crc kubenswrapper[4789]: I1216 07:34:49.105567 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:34:49 crc kubenswrapper[4789]: E1216 07:34:49.106454 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:35:04 crc kubenswrapper[4789]: I1216 07:35:04.104993 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:35:04 crc kubenswrapper[4789]: E1216 07:35:04.105717 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:35:15 crc kubenswrapper[4789]: I1216 07:35:15.105186 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:35:15 crc kubenswrapper[4789]: E1216 07:35:15.106021 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:35:27 crc kubenswrapper[4789]: I1216 07:35:27.105854 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:35:27 crc kubenswrapper[4789]: E1216 07:35:27.106653 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:35:42 crc kubenswrapper[4789]: I1216 07:35:42.114249 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:35:42 crc kubenswrapper[4789]: E1216 07:35:42.115889 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:35:56 crc kubenswrapper[4789]: I1216 07:35:56.104608 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:35:56 crc kubenswrapper[4789]: E1216 07:35:56.105574 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:36:07 crc kubenswrapper[4789]: I1216 07:36:07.105102 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:36:07 crc kubenswrapper[4789]: E1216 07:36:07.106058 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:36:19 crc kubenswrapper[4789]: I1216 07:36:19.104432 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:36:19 crc kubenswrapper[4789]: E1216 07:36:19.105013 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 07:36:34 crc kubenswrapper[4789]: I1216 07:36:34.104665 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff"
Dec 16 07:36:35 crc kubenswrapper[4789]: I1216 07:36:35.061166 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"00c502ea503e8c21f2c7d1df2010a917580b418969a994f3e2aa4f910d4ae95c"}
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.069605 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7z6j2"]
Dec 16 07:37:57 crc kubenswrapper[4789]: E1216 07:37:57.070530 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="extract-utilities"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070548 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="extract-utilities"
Dec 16 07:37:57 crc kubenswrapper[4789]: E1216 07:37:57.070567 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="registry-server"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070574 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="registry-server"
Dec 16 07:37:57 crc kubenswrapper[4789]: E1216 07:37:57.070592 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="extract-content"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070600 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="extract-content"
Dec 16 07:37:57 crc kubenswrapper[4789]: E1216 07:37:57.070612 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="extract-utilities"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070620 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="extract-utilities"
Dec 16 07:37:57 crc kubenswrapper[4789]: E1216 07:37:57.070641 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="registry-server"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070649 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="registry-server"
Dec 16 07:37:57 crc kubenswrapper[4789]: E1216 07:37:57.070661 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="extract-content"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070669 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="extract-content"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070818 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="642f3855-99de-4f55-b989-9c8d79599439" containerName="registry-server"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.070837 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="341e149a-ee61-44e8-8645-c5fc7452430d" containerName="registry-server"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.072164 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.077632 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7z6j2"]
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.102710 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz5tt\" (UniqueName: \"kubernetes.io/projected/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-kube-api-access-tz5tt\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.102813 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-utilities\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.102839 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-catalog-content\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.203933 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-utilities\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.204364 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-catalog-content\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.204558 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz5tt\" (UniqueName: \"kubernetes.io/projected/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-kube-api-access-tz5tt\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.204720 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-utilities\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.206162 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-catalog-content\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.225749 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz5tt\" (UniqueName: \"kubernetes.io/projected/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-kube-api-access-tz5tt\") pod \"community-operators-7z6j2\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") " pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.507905 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:37:57 crc kubenswrapper[4789]: I1216 07:37:57.972135 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7z6j2"]
Dec 16 07:37:58 crc kubenswrapper[4789]: I1216 07:37:58.660757 4789 generic.go:334] "Generic (PLEG): container finished" podID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerID="3e787cff94ef8dcc26f851fd4e9a3680a2fdac987b8879c65b292f61447a4774" exitCode=0
Dec 16 07:37:58 crc kubenswrapper[4789]: I1216 07:37:58.660809 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z6j2" event={"ID":"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d","Type":"ContainerDied","Data":"3e787cff94ef8dcc26f851fd4e9a3680a2fdac987b8879c65b292f61447a4774"}
Dec 16 07:37:58 crc kubenswrapper[4789]: I1216 07:37:58.660889 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z6j2" event={"ID":"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d","Type":"ContainerStarted","Data":"9d017c4efcb49ca8c81d29ab55afe36106080e5d02f58b46fcd681f9a9119b48"}
Dec 16 07:37:58 crc kubenswrapper[4789]: I1216 07:37:58.663810 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 07:37:59 crc kubenswrapper[4789]: I1216 07:37:59.669414 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z6j2" event={"ID":"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d","Type":"ContainerStarted","Data":"4d384e968104a060d1fb9a9bd6adf89d88f381861749bcba667c85cedc30559d"}
Dec 16 07:38:00 crc kubenswrapper[4789]: I1216 07:38:00.676648 4789 generic.go:334] "Generic (PLEG): container finished" podID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerID="4d384e968104a060d1fb9a9bd6adf89d88f381861749bcba667c85cedc30559d" exitCode=0
Dec 16 07:38:00 crc kubenswrapper[4789]: I1216 07:38:00.676697 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z6j2" event={"ID":"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d","Type":"ContainerDied","Data":"4d384e968104a060d1fb9a9bd6adf89d88f381861749bcba667c85cedc30559d"}
Dec 16 07:38:01 crc kubenswrapper[4789]: I1216 07:38:01.684965 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z6j2" event={"ID":"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d","Type":"ContainerStarted","Data":"ed86558d74009442f10654cb49f3413f43aa25d329c4360ce027fbe3c8d9fcc1"}
Dec 16 07:38:01 crc kubenswrapper[4789]: I1216 07:38:01.704251 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7z6j2" podStartSLOduration=2.237553572 podStartE2EDuration="4.704229655s" podCreationTimestamp="2025-12-16 07:37:57 +0000 UTC" firstStartedPulling="2025-12-16 07:37:58.663092804 +0000 UTC m=+2816.924980433" lastFinishedPulling="2025-12-16 07:38:01.129768887 +0000 UTC m=+2819.391656516" observedRunningTime="2025-12-16 07:38:01.701098888 +0000 UTC m=+2819.962986537" watchObservedRunningTime="2025-12-16 07:38:01.704229655 +0000 UTC m=+2819.966117284"
Dec 16 07:38:07 crc kubenswrapper[4789]: I1216 07:38:07.508835 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:38:07 crc kubenswrapper[4789]: I1216 07:38:07.511353 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:38:07 crc kubenswrapper[4789]: I1216 07:38:07.554872 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:38:07 crc kubenswrapper[4789]: I1216 07:38:07.761461 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:38:07 crc kubenswrapper[4789]: I1216 07:38:07.811991 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7z6j2"]
Dec 16 07:38:09 crc kubenswrapper[4789]: I1216 07:38:09.734415 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7z6j2" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="registry-server" containerID="cri-o://ed86558d74009442f10654cb49f3413f43aa25d329c4360ce027fbe3c8d9fcc1" gracePeriod=2
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.743312 4789 generic.go:334] "Generic (PLEG): container finished" podID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerID="ed86558d74009442f10654cb49f3413f43aa25d329c4360ce027fbe3c8d9fcc1" exitCode=0
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.743448 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z6j2" event={"ID":"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d","Type":"ContainerDied","Data":"ed86558d74009442f10654cb49f3413f43aa25d329c4360ce027fbe3c8d9fcc1"}
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.743760 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7z6j2" event={"ID":"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d","Type":"ContainerDied","Data":"9d017c4efcb49ca8c81d29ab55afe36106080e5d02f58b46fcd681f9a9119b48"}
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.743779 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d017c4efcb49ca8c81d29ab55afe36106080e5d02f58b46fcd681f9a9119b48"
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.767744 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.878047 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-catalog-content\") pod \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") "
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.878213 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz5tt\" (UniqueName: \"kubernetes.io/projected/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-kube-api-access-tz5tt\") pod \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") "
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.878287 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-utilities\") pod \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\" (UID: \"d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d\") "
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.879646 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-utilities" (OuterVolumeSpecName: "utilities") pod "d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" (UID: "d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.883970 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-kube-api-access-tz5tt" (OuterVolumeSpecName: "kube-api-access-tz5tt") pod "d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" (UID: "d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d"). InnerVolumeSpecName "kube-api-access-tz5tt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.931514 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" (UID: "d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.979897 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz5tt\" (UniqueName: \"kubernetes.io/projected/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-kube-api-access-tz5tt\") on node \"crc\" DevicePath \"\""
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.979951 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 07:38:10 crc kubenswrapper[4789]: I1216 07:38:10.979964 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 07:38:11 crc kubenswrapper[4789]: I1216 07:38:11.749787 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7z6j2"
Dec 16 07:38:11 crc kubenswrapper[4789]: I1216 07:38:11.780054 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7z6j2"]
Dec 16 07:38:11 crc kubenswrapper[4789]: I1216 07:38:11.785827 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7z6j2"]
Dec 16 07:38:12 crc kubenswrapper[4789]: I1216 07:38:12.112888 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" path="/var/lib/kubelet/pods/d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d/volumes"
Dec 16 07:38:34 crc kubenswrapper[4789]: I1216 07:38:34.864786 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jglf4"]
Dec 16 07:38:34 crc kubenswrapper[4789]: E1216 07:38:34.866313 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="extract-content"
Dec 16 07:38:34 crc kubenswrapper[4789]: I1216 07:38:34.866342 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="extract-content"
Dec 16 07:38:34 crc kubenswrapper[4789]: E1216 07:38:34.866357 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="extract-utilities"
Dec 16 07:38:34 crc kubenswrapper[4789]: I1216 07:38:34.866366 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="extract-utilities"
Dec 16 07:38:34 crc kubenswrapper[4789]: E1216 07:38:34.866384 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="registry-server"
Dec 16 07:38:34 crc kubenswrapper[4789]: I1216 07:38:34.866396 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="registry-server"
Dec 16 07:38:34 crc kubenswrapper[4789]: I1216 07:38:34.866605 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7467a6a-fa1f-4c0f-91f5-d59d0d686f0d" containerName="registry-server"
Dec 16 07:38:34 crc kubenswrapper[4789]: I1216 07:38:34.868071 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:34 crc kubenswrapper[4789]: I1216 07:38:34.879785 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jglf4"]
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.018811 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-utilities\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.019263 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-catalog-content\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.019307 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hth78\" (UniqueName: \"kubernetes.io/projected/611f8b78-3ed5-4c6d-9522-ad167be21466-kube-api-access-hth78\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.120900 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-utilities\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.121017 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-catalog-content\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.121045 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hth78\" (UniqueName: \"kubernetes.io/projected/611f8b78-3ed5-4c6d-9522-ad167be21466-kube-api-access-hth78\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.121436 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-catalog-content\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.121545 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-utilities\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.139604 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hth78\" (UniqueName: \"kubernetes.io/projected/611f8b78-3ed5-4c6d-9522-ad167be21466-kube-api-access-hth78\") pod \"redhat-marketplace-jglf4\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") " pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.240330 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.499155 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jglf4"]
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.902251 4789 generic.go:334] "Generic (PLEG): container finished" podID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerID="0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb" exitCode=0
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.902308 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jglf4" event={"ID":"611f8b78-3ed5-4c6d-9522-ad167be21466","Type":"ContainerDied","Data":"0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb"}
Dec 16 07:38:35 crc kubenswrapper[4789]: I1216 07:38:35.902351 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jglf4" event={"ID":"611f8b78-3ed5-4c6d-9522-ad167be21466","Type":"ContainerStarted","Data":"86d9c5a5c95ec21d13f780fb8efcdadef14aa19798e78a1da2f3878e81ca8c4c"}
Dec 16 07:38:37 crc kubenswrapper[4789]: I1216 07:38:37.916270 4789 generic.go:334] "Generic (PLEG): container finished" podID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerID="bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2" exitCode=0
Dec 16 07:38:37 crc kubenswrapper[4789]: I1216 07:38:37.916340 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jglf4" event={"ID":"611f8b78-3ed5-4c6d-9522-ad167be21466","Type":"ContainerDied","Data":"bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2"}
Dec 16 07:38:38 crc kubenswrapper[4789]: I1216 07:38:38.925291 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jglf4" event={"ID":"611f8b78-3ed5-4c6d-9522-ad167be21466","Type":"ContainerStarted","Data":"e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78"}
Dec 16 07:38:38 crc kubenswrapper[4789]: I1216 07:38:38.942429 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jglf4" podStartSLOduration=2.468758872 podStartE2EDuration="4.942402556s" podCreationTimestamp="2025-12-16 07:38:34 +0000 UTC" firstStartedPulling="2025-12-16 07:38:35.90352561 +0000 UTC m=+2854.165413249" lastFinishedPulling="2025-12-16 07:38:38.377169304 +0000 UTC m=+2856.639056933" observedRunningTime="2025-12-16 07:38:38.939973437 +0000 UTC m=+2857.201861066" watchObservedRunningTime="2025-12-16 07:38:38.942402556 +0000 UTC m=+2857.204290195"
Dec 16 07:38:45 crc kubenswrapper[4789]: I1216 07:38:45.241173 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:45 crc kubenswrapper[4789]: I1216 07:38:45.242559 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:45 crc kubenswrapper[4789]: I1216 07:38:45.282170 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:46 crc kubenswrapper[4789]: I1216 07:38:46.009542 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:46 crc kubenswrapper[4789]: I1216 07:38:46.050364 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jglf4"]
Dec 16 07:38:47 crc kubenswrapper[4789]: I1216 07:38:47.991583 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jglf4" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="registry-server" containerID="cri-o://e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78" gracePeriod=2
Dec 16 07:38:48 crc kubenswrapper[4789]: I1216 07:38:48.884306 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.000338 4789 generic.go:334] "Generic (PLEG): container finished" podID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerID="e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78" exitCode=0
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.000412 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jglf4"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.000415 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jglf4" event={"ID":"611f8b78-3ed5-4c6d-9522-ad167be21466","Type":"ContainerDied","Data":"e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78"}
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.000842 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jglf4" event={"ID":"611f8b78-3ed5-4c6d-9522-ad167be21466","Type":"ContainerDied","Data":"86d9c5a5c95ec21d13f780fb8efcdadef14aa19798e78a1da2f3878e81ca8c4c"}
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.000861 4789 scope.go:117] "RemoveContainer" containerID="e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.019819 4789 scope.go:117] "RemoveContainer" containerID="bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.029638 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hth78\" (UniqueName: \"kubernetes.io/projected/611f8b78-3ed5-4c6d-9522-ad167be21466-kube-api-access-hth78\") pod \"611f8b78-3ed5-4c6d-9522-ad167be21466\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") "
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.029807 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-catalog-content\") pod \"611f8b78-3ed5-4c6d-9522-ad167be21466\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") "
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.029997 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-utilities\") pod \"611f8b78-3ed5-4c6d-9522-ad167be21466\" (UID: \"611f8b78-3ed5-4c6d-9522-ad167be21466\") "
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.031216 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-utilities" (OuterVolumeSpecName: "utilities") pod "611f8b78-3ed5-4c6d-9522-ad167be21466" (UID: "611f8b78-3ed5-4c6d-9522-ad167be21466"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.036832 4789 scope.go:117] "RemoveContainer" containerID="0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.039166 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611f8b78-3ed5-4c6d-9522-ad167be21466-kube-api-access-hth78" (OuterVolumeSpecName: "kube-api-access-hth78") pod "611f8b78-3ed5-4c6d-9522-ad167be21466" (UID: "611f8b78-3ed5-4c6d-9522-ad167be21466"). InnerVolumeSpecName "kube-api-access-hth78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.056161 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "611f8b78-3ed5-4c6d-9522-ad167be21466" (UID: "611f8b78-3ed5-4c6d-9522-ad167be21466"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.085374 4789 scope.go:117] "RemoveContainer" containerID="e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78"
Dec 16 07:38:49 crc kubenswrapper[4789]: E1216 07:38:49.085891 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78\": container with ID starting with e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78 not found: ID does not exist" containerID="e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.085957 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78"} err="failed to get container status \"e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78\": rpc error: code = NotFound desc = could not find container \"e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78\": container with ID starting with e670412a3db7f4ba9c4dc960c2a5bfade452cc63b79a56dd46e6b09df476cb78 not found: ID does not exist"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.085984 4789 scope.go:117] "RemoveContainer" containerID="bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2"
Dec 16 07:38:49 crc kubenswrapper[4789]: E1216 07:38:49.086411 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2\": container with ID starting with bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2 not found: ID does not exist" containerID="bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2"
Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.086447
4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2"} err="failed to get container status \"bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2\": rpc error: code = NotFound desc = could not find container \"bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2\": container with ID starting with bf211caefdcb43005d10fee6c390947c1f77be172e0999c81be6700b38e340d2 not found: ID does not exist" Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.086470 4789 scope.go:117] "RemoveContainer" containerID="0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb" Dec 16 07:38:49 crc kubenswrapper[4789]: E1216 07:38:49.086948 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb\": container with ID starting with 0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb not found: ID does not exist" containerID="0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb" Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.086973 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb"} err="failed to get container status \"0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb\": rpc error: code = NotFound desc = could not find container \"0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb\": container with ID starting with 0479942abfa14953f88bee9da7e6c9ca42326791d6129b9d2ba779db0de056eb not found: ID does not exist" Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.132085 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.132119 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611f8b78-3ed5-4c6d-9522-ad167be21466-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.132237 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hth78\" (UniqueName: \"kubernetes.io/projected/611f8b78-3ed5-4c6d-9522-ad167be21466-kube-api-access-hth78\") on node \"crc\" DevicePath \"\"" Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.334694 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jglf4"] Dec 16 07:38:49 crc kubenswrapper[4789]: I1216 07:38:49.340214 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jglf4"] Dec 16 07:38:50 crc kubenswrapper[4789]: I1216 07:38:50.114455 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" path="/var/lib/kubelet/pods/611f8b78-3ed5-4c6d-9522-ad167be21466/volumes" Dec 16 07:38:51 crc kubenswrapper[4789]: I1216 07:38:51.927400 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:38:51 crc kubenswrapper[4789]: I1216 07:38:51.927697 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 
16 07:39:21 crc kubenswrapper[4789]: I1216 07:39:21.928209 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:39:21 crc kubenswrapper[4789]: I1216 07:39:21.928789 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:39:51 crc kubenswrapper[4789]: I1216 07:39:51.928241 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:39:51 crc kubenswrapper[4789]: I1216 07:39:51.928806 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:39:51 crc kubenswrapper[4789]: I1216 07:39:51.928850 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:39:51 crc kubenswrapper[4789]: I1216 07:39:51.929471 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00c502ea503e8c21f2c7d1df2010a917580b418969a994f3e2aa4f910d4ae95c"} 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:39:51 crc kubenswrapper[4789]: I1216 07:39:51.929523 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://00c502ea503e8c21f2c7d1df2010a917580b418969a994f3e2aa4f910d4ae95c" gracePeriod=600 Dec 16 07:39:52 crc kubenswrapper[4789]: I1216 07:39:52.436841 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="00c502ea503e8c21f2c7d1df2010a917580b418969a994f3e2aa4f910d4ae95c" exitCode=0 Dec 16 07:39:52 crc kubenswrapper[4789]: I1216 07:39:52.436943 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"00c502ea503e8c21f2c7d1df2010a917580b418969a994f3e2aa4f910d4ae95c"} Dec 16 07:39:52 crc kubenswrapper[4789]: I1216 07:39:52.437233 4789 scope.go:117] "RemoveContainer" containerID="ec27149ea0ec807d701386202bde0a18509f707b927d9329ec8c51d031271eff" Dec 16 07:39:53 crc kubenswrapper[4789]: I1216 07:39:53.447345 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5"} Dec 16 07:42:21 crc kubenswrapper[4789]: I1216 07:42:21.927729 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 16 07:42:21 crc kubenswrapper[4789]: I1216 07:42:21.928337 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.793264 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wrt98"] Dec 16 07:42:29 crc kubenswrapper[4789]: E1216 07:42:29.793893 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="extract-content" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.793908 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="extract-content" Dec 16 07:42:29 crc kubenswrapper[4789]: E1216 07:42:29.793952 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="registry-server" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.793959 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="registry-server" Dec 16 07:42:29 crc kubenswrapper[4789]: E1216 07:42:29.793975 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="extract-utilities" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.793983 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="extract-utilities" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.794150 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="611f8b78-3ed5-4c6d-9522-ad167be21466" containerName="registry-server" Dec 16 07:42:29 crc 
kubenswrapper[4789]: I1216 07:42:29.795369 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.806157 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wrt98"] Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.903148 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-catalog-content\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.903209 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wvq7\" (UniqueName: \"kubernetes.io/projected/2413ab2d-849d-4301-a713-e5e7d4b5aa16-kube-api-access-5wvq7\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:29 crc kubenswrapper[4789]: I1216 07:42:29.903355 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-utilities\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.004238 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wvq7\" (UniqueName: \"kubernetes.io/projected/2413ab2d-849d-4301-a713-e5e7d4b5aa16-kube-api-access-5wvq7\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " 
pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.004400 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-utilities\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.004849 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-utilities\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.005082 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-catalog-content\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.005353 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-catalog-content\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.028813 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wvq7\" (UniqueName: \"kubernetes.io/projected/2413ab2d-849d-4301-a713-e5e7d4b5aa16-kube-api-access-5wvq7\") pod \"certified-operators-wrt98\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " 
pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.114411 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.368553 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wrt98"] Dec 16 07:42:30 crc kubenswrapper[4789]: I1216 07:42:30.446747 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrt98" event={"ID":"2413ab2d-849d-4301-a713-e5e7d4b5aa16","Type":"ContainerStarted","Data":"3222f2aaf1ffb31c39a61e9d5ee961280f20836fbc5002208a863e691430a01a"} Dec 16 07:42:31 crc kubenswrapper[4789]: I1216 07:42:31.456095 4789 generic.go:334] "Generic (PLEG): container finished" podID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerID="72238ce7e3c992619006c9090e0530b6909854b23b7a02c20bfdb4d3930e9122" exitCode=0 Dec 16 07:42:31 crc kubenswrapper[4789]: I1216 07:42:31.456140 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrt98" event={"ID":"2413ab2d-849d-4301-a713-e5e7d4b5aa16","Type":"ContainerDied","Data":"72238ce7e3c992619006c9090e0530b6909854b23b7a02c20bfdb4d3930e9122"} Dec 16 07:42:32 crc kubenswrapper[4789]: I1216 07:42:32.463649 4789 generic.go:334] "Generic (PLEG): container finished" podID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerID="4c6ee9ef9b1cafabdd3280ac6440ab8d16d7d0351e9ba874f47bf77c14850b3c" exitCode=0 Dec 16 07:42:32 crc kubenswrapper[4789]: I1216 07:42:32.463724 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrt98" event={"ID":"2413ab2d-849d-4301-a713-e5e7d4b5aa16","Type":"ContainerDied","Data":"4c6ee9ef9b1cafabdd3280ac6440ab8d16d7d0351e9ba874f47bf77c14850b3c"} Dec 16 07:42:33 crc kubenswrapper[4789]: I1216 07:42:33.472876 4789 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrt98" event={"ID":"2413ab2d-849d-4301-a713-e5e7d4b5aa16","Type":"ContainerStarted","Data":"4cccbb23de8939384e587beefa71e52d08920e3a6b3fad586eeef8b7920aa65f"} Dec 16 07:42:33 crc kubenswrapper[4789]: I1216 07:42:33.494000 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wrt98" podStartSLOduration=3.044672426 podStartE2EDuration="4.493979743s" podCreationTimestamp="2025-12-16 07:42:29 +0000 UTC" firstStartedPulling="2025-12-16 07:42:31.45798032 +0000 UTC m=+3089.719867949" lastFinishedPulling="2025-12-16 07:42:32.907287637 +0000 UTC m=+3091.169175266" observedRunningTime="2025-12-16 07:42:33.490389575 +0000 UTC m=+3091.752277214" watchObservedRunningTime="2025-12-16 07:42:33.493979743 +0000 UTC m=+3091.755867382" Dec 16 07:42:40 crc kubenswrapper[4789]: I1216 07:42:40.115150 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:40 crc kubenswrapper[4789]: I1216 07:42:40.115678 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:40 crc kubenswrapper[4789]: I1216 07:42:40.166399 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:40 crc kubenswrapper[4789]: I1216 07:42:40.558520 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:40 crc kubenswrapper[4789]: I1216 07:42:40.610315 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wrt98"] Dec 16 07:42:42 crc kubenswrapper[4789]: I1216 07:42:42.527952 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wrt98" 
podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="registry-server" containerID="cri-o://4cccbb23de8939384e587beefa71e52d08920e3a6b3fad586eeef8b7920aa65f" gracePeriod=2 Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.550701 4789 generic.go:334] "Generic (PLEG): container finished" podID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerID="4cccbb23de8939384e587beefa71e52d08920e3a6b3fad586eeef8b7920aa65f" exitCode=0 Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.551178 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrt98" event={"ID":"2413ab2d-849d-4301-a713-e5e7d4b5aa16","Type":"ContainerDied","Data":"4cccbb23de8939384e587beefa71e52d08920e3a6b3fad586eeef8b7920aa65f"} Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.703373 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.814574 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wvq7\" (UniqueName: \"kubernetes.io/projected/2413ab2d-849d-4301-a713-e5e7d4b5aa16-kube-api-access-5wvq7\") pod \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.814643 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-utilities\") pod \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\" (UID: \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.814668 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-catalog-content\") pod \"2413ab2d-849d-4301-a713-e5e7d4b5aa16\" (UID: 
\"2413ab2d-849d-4301-a713-e5e7d4b5aa16\") " Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.816293 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-utilities" (OuterVolumeSpecName: "utilities") pod "2413ab2d-849d-4301-a713-e5e7d4b5aa16" (UID: "2413ab2d-849d-4301-a713-e5e7d4b5aa16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.821475 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2413ab2d-849d-4301-a713-e5e7d4b5aa16-kube-api-access-5wvq7" (OuterVolumeSpecName: "kube-api-access-5wvq7") pod "2413ab2d-849d-4301-a713-e5e7d4b5aa16" (UID: "2413ab2d-849d-4301-a713-e5e7d4b5aa16"). InnerVolumeSpecName "kube-api-access-5wvq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.867507 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2413ab2d-849d-4301-a713-e5e7d4b5aa16" (UID: "2413ab2d-849d-4301-a713-e5e7d4b5aa16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.916093 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wvq7\" (UniqueName: \"kubernetes.io/projected/2413ab2d-849d-4301-a713-e5e7d4b5aa16-kube-api-access-5wvq7\") on node \"crc\" DevicePath \"\"" Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.916147 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:42:44 crc kubenswrapper[4789]: I1216 07:42:44.916159 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2413ab2d-849d-4301-a713-e5e7d4b5aa16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:42:45 crc kubenswrapper[4789]: I1216 07:42:45.576161 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wrt98" event={"ID":"2413ab2d-849d-4301-a713-e5e7d4b5aa16","Type":"ContainerDied","Data":"3222f2aaf1ffb31c39a61e9d5ee961280f20836fbc5002208a863e691430a01a"} Dec 16 07:42:45 crc kubenswrapper[4789]: I1216 07:42:45.576218 4789 scope.go:117] "RemoveContainer" containerID="4cccbb23de8939384e587beefa71e52d08920e3a6b3fad586eeef8b7920aa65f" Dec 16 07:42:45 crc kubenswrapper[4789]: I1216 07:42:45.576242 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wrt98" Dec 16 07:42:45 crc kubenswrapper[4789]: I1216 07:42:45.595818 4789 scope.go:117] "RemoveContainer" containerID="4c6ee9ef9b1cafabdd3280ac6440ab8d16d7d0351e9ba874f47bf77c14850b3c" Dec 16 07:42:45 crc kubenswrapper[4789]: I1216 07:42:45.618095 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wrt98"] Dec 16 07:42:45 crc kubenswrapper[4789]: I1216 07:42:45.630401 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wrt98"] Dec 16 07:42:45 crc kubenswrapper[4789]: I1216 07:42:45.638301 4789 scope.go:117] "RemoveContainer" containerID="72238ce7e3c992619006c9090e0530b6909854b23b7a02c20bfdb4d3930e9122" Dec 16 07:42:46 crc kubenswrapper[4789]: I1216 07:42:46.114587 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" path="/var/lib/kubelet/pods/2413ab2d-849d-4301-a713-e5e7d4b5aa16/volumes" Dec 16 07:42:51 crc kubenswrapper[4789]: I1216 07:42:51.927752 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:42:51 crc kubenswrapper[4789]: I1216 07:42:51.928456 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:43:00 crc kubenswrapper[4789]: I1216 07:43:00.985973 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xqlj"] Dec 16 07:43:00 crc kubenswrapper[4789]: E1216 
07:43:00.986648 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="extract-utilities" Dec 16 07:43:00 crc kubenswrapper[4789]: I1216 07:43:00.986662 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="extract-utilities" Dec 16 07:43:00 crc kubenswrapper[4789]: E1216 07:43:00.986678 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="extract-content" Dec 16 07:43:00 crc kubenswrapper[4789]: I1216 07:43:00.986684 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="extract-content" Dec 16 07:43:00 crc kubenswrapper[4789]: E1216 07:43:00.986694 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="registry-server" Dec 16 07:43:00 crc kubenswrapper[4789]: I1216 07:43:00.986700 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="registry-server" Dec 16 07:43:00 crc kubenswrapper[4789]: I1216 07:43:00.986844 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2413ab2d-849d-4301-a713-e5e7d4b5aa16" containerName="registry-server" Dec 16 07:43:00 crc kubenswrapper[4789]: I1216 07:43:00.988043 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.001443 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xqlj"] Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.101298 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-utilities\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.101379 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4st\" (UniqueName: \"kubernetes.io/projected/92324704-6820-47e8-831b-bb1f079bbc53-kube-api-access-4x4st\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.101419 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-catalog-content\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.202867 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4st\" (UniqueName: \"kubernetes.io/projected/92324704-6820-47e8-831b-bb1f079bbc53-kube-api-access-4x4st\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.202980 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-catalog-content\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.203057 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-utilities\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.203601 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-catalog-content\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.203631 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-utilities\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.232104 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4st\" (UniqueName: \"kubernetes.io/projected/92324704-6820-47e8-831b-bb1f079bbc53-kube-api-access-4x4st\") pod \"redhat-operators-4xqlj\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.305716 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:01 crc kubenswrapper[4789]: I1216 07:43:01.747545 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xqlj"] Dec 16 07:43:02 crc kubenswrapper[4789]: I1216 07:43:02.327180 4789 generic.go:334] "Generic (PLEG): container finished" podID="92324704-6820-47e8-831b-bb1f079bbc53" containerID="f35f04d09fe6ba59b6e3344ead0ef95e0e936446a11e8bea2b225b45f3d28fc7" exitCode=0 Dec 16 07:43:02 crc kubenswrapper[4789]: I1216 07:43:02.327240 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xqlj" event={"ID":"92324704-6820-47e8-831b-bb1f079bbc53","Type":"ContainerDied","Data":"f35f04d09fe6ba59b6e3344ead0ef95e0e936446a11e8bea2b225b45f3d28fc7"} Dec 16 07:43:02 crc kubenswrapper[4789]: I1216 07:43:02.327430 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xqlj" event={"ID":"92324704-6820-47e8-831b-bb1f079bbc53","Type":"ContainerStarted","Data":"dd201fdfd1441d2a6b44698f7a875fe77b7cad7321f09038b991eec961c61fc2"} Dec 16 07:43:02 crc kubenswrapper[4789]: I1216 07:43:02.329049 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:43:03 crc kubenswrapper[4789]: I1216 07:43:03.335459 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xqlj" event={"ID":"92324704-6820-47e8-831b-bb1f079bbc53","Type":"ContainerStarted","Data":"26a13248ab916a4090a05f600403bb6d1f8137f50372a09be9b44a281db52e1b"} Dec 16 07:43:04 crc kubenswrapper[4789]: I1216 07:43:04.343302 4789 generic.go:334] "Generic (PLEG): container finished" podID="92324704-6820-47e8-831b-bb1f079bbc53" containerID="26a13248ab916a4090a05f600403bb6d1f8137f50372a09be9b44a281db52e1b" exitCode=0 Dec 16 07:43:04 crc kubenswrapper[4789]: I1216 07:43:04.343600 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4xqlj" event={"ID":"92324704-6820-47e8-831b-bb1f079bbc53","Type":"ContainerDied","Data":"26a13248ab916a4090a05f600403bb6d1f8137f50372a09be9b44a281db52e1b"} Dec 16 07:43:05 crc kubenswrapper[4789]: I1216 07:43:05.351547 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xqlj" event={"ID":"92324704-6820-47e8-831b-bb1f079bbc53","Type":"ContainerStarted","Data":"00112292a2247a2a2af4a32d61da2133b68a9a5b120d79e73f894b58e656e975"} Dec 16 07:43:05 crc kubenswrapper[4789]: I1216 07:43:05.377353 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xqlj" podStartSLOduration=2.720768911 podStartE2EDuration="5.377336472s" podCreationTimestamp="2025-12-16 07:43:00 +0000 UTC" firstStartedPulling="2025-12-16 07:43:02.328773301 +0000 UTC m=+3120.590660930" lastFinishedPulling="2025-12-16 07:43:04.985340862 +0000 UTC m=+3123.247228491" observedRunningTime="2025-12-16 07:43:05.373382566 +0000 UTC m=+3123.635270215" watchObservedRunningTime="2025-12-16 07:43:05.377336472 +0000 UTC m=+3123.639224101" Dec 16 07:43:11 crc kubenswrapper[4789]: I1216 07:43:11.306285 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:11 crc kubenswrapper[4789]: I1216 07:43:11.306650 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:11 crc kubenswrapper[4789]: I1216 07:43:11.346971 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:11 crc kubenswrapper[4789]: I1216 07:43:11.427289 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:11 crc kubenswrapper[4789]: I1216 07:43:11.577382 4789 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xqlj"] Dec 16 07:43:13 crc kubenswrapper[4789]: I1216 07:43:13.398369 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4xqlj" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="registry-server" containerID="cri-o://00112292a2247a2a2af4a32d61da2133b68a9a5b120d79e73f894b58e656e975" gracePeriod=2 Dec 16 07:43:17 crc kubenswrapper[4789]: I1216 07:43:17.940094 4789 generic.go:334] "Generic (PLEG): container finished" podID="92324704-6820-47e8-831b-bb1f079bbc53" containerID="00112292a2247a2a2af4a32d61da2133b68a9a5b120d79e73f894b58e656e975" exitCode=0 Dec 16 07:43:17 crc kubenswrapper[4789]: I1216 07:43:17.940260 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xqlj" event={"ID":"92324704-6820-47e8-831b-bb1f079bbc53","Type":"ContainerDied","Data":"00112292a2247a2a2af4a32d61da2133b68a9a5b120d79e73f894b58e656e975"} Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.071702 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.231449 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4st\" (UniqueName: \"kubernetes.io/projected/92324704-6820-47e8-831b-bb1f079bbc53-kube-api-access-4x4st\") pod \"92324704-6820-47e8-831b-bb1f079bbc53\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.231508 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-utilities\") pod \"92324704-6820-47e8-831b-bb1f079bbc53\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.231611 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-catalog-content\") pod \"92324704-6820-47e8-831b-bb1f079bbc53\" (UID: \"92324704-6820-47e8-831b-bb1f079bbc53\") " Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.233218 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-utilities" (OuterVolumeSpecName: "utilities") pod "92324704-6820-47e8-831b-bb1f079bbc53" (UID: "92324704-6820-47e8-831b-bb1f079bbc53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.237903 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92324704-6820-47e8-831b-bb1f079bbc53-kube-api-access-4x4st" (OuterVolumeSpecName: "kube-api-access-4x4st") pod "92324704-6820-47e8-831b-bb1f079bbc53" (UID: "92324704-6820-47e8-831b-bb1f079bbc53"). InnerVolumeSpecName "kube-api-access-4x4st". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.333616 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4st\" (UniqueName: \"kubernetes.io/projected/92324704-6820-47e8-831b-bb1f079bbc53-kube-api-access-4x4st\") on node \"crc\" DevicePath \"\"" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.333642 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.928706 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92324704-6820-47e8-831b-bb1f079bbc53" (UID: "92324704-6820-47e8-831b-bb1f079bbc53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.942143 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92324704-6820-47e8-831b-bb1f079bbc53-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.949041 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xqlj" event={"ID":"92324704-6820-47e8-831b-bb1f079bbc53","Type":"ContainerDied","Data":"dd201fdfd1441d2a6b44698f7a875fe77b7cad7321f09038b991eec961c61fc2"} Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.949071 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xqlj" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.949096 4789 scope.go:117] "RemoveContainer" containerID="00112292a2247a2a2af4a32d61da2133b68a9a5b120d79e73f894b58e656e975" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.966212 4789 scope.go:117] "RemoveContainer" containerID="26a13248ab916a4090a05f600403bb6d1f8137f50372a09be9b44a281db52e1b" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.989408 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xqlj"] Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.993238 4789 scope.go:117] "RemoveContainer" containerID="f35f04d09fe6ba59b6e3344ead0ef95e0e936446a11e8bea2b225b45f3d28fc7" Dec 16 07:43:18 crc kubenswrapper[4789]: I1216 07:43:18.996409 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4xqlj"] Dec 16 07:43:20 crc kubenswrapper[4789]: I1216 07:43:20.115812 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92324704-6820-47e8-831b-bb1f079bbc53" path="/var/lib/kubelet/pods/92324704-6820-47e8-831b-bb1f079bbc53/volumes" Dec 16 07:43:21 crc kubenswrapper[4789]: I1216 07:43:21.928289 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:43:21 crc kubenswrapper[4789]: I1216 07:43:21.929032 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:43:21 crc kubenswrapper[4789]: I1216 
07:43:21.929173 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:43:21 crc kubenswrapper[4789]: I1216 07:43:21.930047 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:43:21 crc kubenswrapper[4789]: I1216 07:43:21.930189 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" gracePeriod=600 Dec 16 07:43:22 crc kubenswrapper[4789]: I1216 07:43:22.982425 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" exitCode=0 Dec 16 07:43:22 crc kubenswrapper[4789]: I1216 07:43:22.982484 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5"} Dec 16 07:43:22 crc kubenswrapper[4789]: I1216 07:43:22.982756 4789 scope.go:117] "RemoveContainer" containerID="00c502ea503e8c21f2c7d1df2010a917580b418969a994f3e2aa4f910d4ae95c" Dec 16 07:43:23 crc kubenswrapper[4789]: E1216 07:43:23.154110 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:43:23 crc kubenswrapper[4789]: I1216 07:43:23.991676 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:43:23 crc kubenswrapper[4789]: E1216 07:43:23.992091 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:43:36 crc kubenswrapper[4789]: I1216 07:43:36.104929 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:43:36 crc kubenswrapper[4789]: E1216 07:43:36.105666 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:43:47 crc kubenswrapper[4789]: I1216 07:43:47.104606 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:43:47 crc kubenswrapper[4789]: E1216 07:43:47.105565 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:44:00 crc kubenswrapper[4789]: I1216 07:44:00.105265 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:44:00 crc kubenswrapper[4789]: E1216 07:44:00.106092 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:44:13 crc kubenswrapper[4789]: I1216 07:44:13.105357 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:44:13 crc kubenswrapper[4789]: E1216 07:44:13.106007 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:44:28 crc kubenswrapper[4789]: I1216 07:44:28.105158 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:44:28 crc kubenswrapper[4789]: E1216 07:44:28.105947 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:44:28 crc kubenswrapper[4789]: I1216 07:44:28.220795 4789 scope.go:117] "RemoveContainer" containerID="ed86558d74009442f10654cb49f3413f43aa25d329c4360ce027fbe3c8d9fcc1" Dec 16 07:44:28 crc kubenswrapper[4789]: I1216 07:44:28.241589 4789 scope.go:117] "RemoveContainer" containerID="4d384e968104a060d1fb9a9bd6adf89d88f381861749bcba667c85cedc30559d" Dec 16 07:44:28 crc kubenswrapper[4789]: I1216 07:44:28.261283 4789 scope.go:117] "RemoveContainer" containerID="3e787cff94ef8dcc26f851fd4e9a3680a2fdac987b8879c65b292f61447a4774" Dec 16 07:44:41 crc kubenswrapper[4789]: I1216 07:44:41.104762 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:44:41 crc kubenswrapper[4789]: E1216 07:44:41.105476 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:44:54 crc kubenswrapper[4789]: I1216 07:44:54.104936 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:44:54 crc kubenswrapper[4789]: E1216 07:44:54.105449 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.151269 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn"] Dec 16 07:45:00 crc kubenswrapper[4789]: E1216 07:45:00.151985 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="registry-server" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.152003 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="registry-server" Dec 16 07:45:00 crc kubenswrapper[4789]: E1216 07:45:00.152018 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="extract-content" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.152025 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="extract-content" Dec 16 07:45:00 crc kubenswrapper[4789]: E1216 07:45:00.152058 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="extract-utilities" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.152065 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="extract-utilities" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.152306 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="92324704-6820-47e8-831b-bb1f079bbc53" containerName="registry-server" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.152800 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.155709 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.155794 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.162149 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn"] Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.210662 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b256111-1ac9-4f85-930e-4316e29c55fe-secret-volume\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.210762 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b256111-1ac9-4f85-930e-4316e29c55fe-config-volume\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.210797 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dptgh\" (UniqueName: \"kubernetes.io/projected/4b256111-1ac9-4f85-930e-4316e29c55fe-kube-api-access-dptgh\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.312119 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b256111-1ac9-4f85-930e-4316e29c55fe-config-volume\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.312180 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dptgh\" (UniqueName: \"kubernetes.io/projected/4b256111-1ac9-4f85-930e-4316e29c55fe-kube-api-access-dptgh\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.312223 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b256111-1ac9-4f85-930e-4316e29c55fe-secret-volume\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.313127 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b256111-1ac9-4f85-930e-4316e29c55fe-config-volume\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.321333 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4b256111-1ac9-4f85-930e-4316e29c55fe-secret-volume\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.330569 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dptgh\" (UniqueName: \"kubernetes.io/projected/4b256111-1ac9-4f85-930e-4316e29c55fe-kube-api-access-dptgh\") pod \"collect-profiles-29431185-4t5tn\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.477873 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:00 crc kubenswrapper[4789]: I1216 07:45:00.911002 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn"] Dec 16 07:45:01 crc kubenswrapper[4789]: I1216 07:45:01.692157 4789 generic.go:334] "Generic (PLEG): container finished" podID="4b256111-1ac9-4f85-930e-4316e29c55fe" containerID="40bc6d14f1f90680efa2f89e6ad96d68bc5f0b2d972df11712cfe2e5ef89774f" exitCode=0 Dec 16 07:45:01 crc kubenswrapper[4789]: I1216 07:45:01.692224 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" event={"ID":"4b256111-1ac9-4f85-930e-4316e29c55fe","Type":"ContainerDied","Data":"40bc6d14f1f90680efa2f89e6ad96d68bc5f0b2d972df11712cfe2e5ef89774f"} Dec 16 07:45:01 crc kubenswrapper[4789]: I1216 07:45:01.692273 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" 
event={"ID":"4b256111-1ac9-4f85-930e-4316e29c55fe","Type":"ContainerStarted","Data":"d83caf25292dc107f0a26cfaaa4d9a50e44d1bc18f9d1c41e1fd4106481d26a4"} Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.006572 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.157741 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b256111-1ac9-4f85-930e-4316e29c55fe-config-volume\") pod \"4b256111-1ac9-4f85-930e-4316e29c55fe\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.158158 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dptgh\" (UniqueName: \"kubernetes.io/projected/4b256111-1ac9-4f85-930e-4316e29c55fe-kube-api-access-dptgh\") pod \"4b256111-1ac9-4f85-930e-4316e29c55fe\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.158199 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b256111-1ac9-4f85-930e-4316e29c55fe-secret-volume\") pod \"4b256111-1ac9-4f85-930e-4316e29c55fe\" (UID: \"4b256111-1ac9-4f85-930e-4316e29c55fe\") " Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.158569 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b256111-1ac9-4f85-930e-4316e29c55fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "4b256111-1ac9-4f85-930e-4316e29c55fe" (UID: "4b256111-1ac9-4f85-930e-4316e29c55fe"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.164178 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b256111-1ac9-4f85-930e-4316e29c55fe-kube-api-access-dptgh" (OuterVolumeSpecName: "kube-api-access-dptgh") pod "4b256111-1ac9-4f85-930e-4316e29c55fe" (UID: "4b256111-1ac9-4f85-930e-4316e29c55fe"). InnerVolumeSpecName "kube-api-access-dptgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.169161 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b256111-1ac9-4f85-930e-4316e29c55fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4b256111-1ac9-4f85-930e-4316e29c55fe" (UID: "4b256111-1ac9-4f85-930e-4316e29c55fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.260975 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b256111-1ac9-4f85-930e-4316e29c55fe-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.261377 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dptgh\" (UniqueName: \"kubernetes.io/projected/4b256111-1ac9-4f85-930e-4316e29c55fe-kube-api-access-dptgh\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.261517 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b256111-1ac9-4f85-930e-4316e29c55fe-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.706550 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" 
event={"ID":"4b256111-1ac9-4f85-930e-4316e29c55fe","Type":"ContainerDied","Data":"d83caf25292dc107f0a26cfaaa4d9a50e44d1bc18f9d1c41e1fd4106481d26a4"} Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.706598 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83caf25292dc107f0a26cfaaa4d9a50e44d1bc18f9d1c41e1fd4106481d26a4" Dec 16 07:45:03 crc kubenswrapper[4789]: I1216 07:45:03.706620 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn" Dec 16 07:45:04 crc kubenswrapper[4789]: I1216 07:45:04.088425 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m"] Dec 16 07:45:04 crc kubenswrapper[4789]: I1216 07:45:04.093480 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431140-hn84m"] Dec 16 07:45:04 crc kubenswrapper[4789]: I1216 07:45:04.112573 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31cf204-c244-4aac-954a-9ef9222209df" path="/var/lib/kubelet/pods/e31cf204-c244-4aac-954a-9ef9222209df/volumes" Dec 16 07:45:06 crc kubenswrapper[4789]: I1216 07:45:06.105337 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:45:06 crc kubenswrapper[4789]: E1216 07:45:06.105846 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:45:17 crc kubenswrapper[4789]: I1216 07:45:17.105248 4789 scope.go:117] "RemoveContainer" 
containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:45:17 crc kubenswrapper[4789]: E1216 07:45:17.105940 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:45:28 crc kubenswrapper[4789]: I1216 07:45:28.111429 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:45:28 crc kubenswrapper[4789]: E1216 07:45:28.112144 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:45:28 crc kubenswrapper[4789]: I1216 07:45:28.309609 4789 scope.go:117] "RemoveContainer" containerID="99adf9c5a0e69c36997dda840eed11aff22e9f0a4100d049a3c33332e3eb562e" Dec 16 07:45:42 crc kubenswrapper[4789]: I1216 07:45:42.108594 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:45:42 crc kubenswrapper[4789]: E1216 07:45:42.110742 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:45:57 crc kubenswrapper[4789]: I1216 07:45:57.106063 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:45:57 crc kubenswrapper[4789]: E1216 07:45:57.107407 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:46:11 crc kubenswrapper[4789]: I1216 07:46:11.105175 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:46:11 crc kubenswrapper[4789]: E1216 07:46:11.105958 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:46:23 crc kubenswrapper[4789]: I1216 07:46:23.105005 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:46:23 crc kubenswrapper[4789]: E1216 07:46:23.106262 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:46:37 crc kubenswrapper[4789]: I1216 07:46:37.104613 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:46:37 crc kubenswrapper[4789]: E1216 07:46:37.105313 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:46:49 crc kubenswrapper[4789]: I1216 07:46:49.104570 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:46:49 crc kubenswrapper[4789]: E1216 07:46:49.105213 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:47:02 crc kubenswrapper[4789]: I1216 07:47:02.108632 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:47:02 crc kubenswrapper[4789]: E1216 07:47:02.109366 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:47:15 crc kubenswrapper[4789]: I1216 07:47:15.105386 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:47:15 crc kubenswrapper[4789]: E1216 07:47:15.106096 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:47:29 crc kubenswrapper[4789]: I1216 07:47:29.105099 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:47:29 crc kubenswrapper[4789]: E1216 07:47:29.107044 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:47:42 crc kubenswrapper[4789]: I1216 07:47:42.112424 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:47:42 crc kubenswrapper[4789]: E1216 07:47:42.113408 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:47:54 crc kubenswrapper[4789]: I1216 07:47:54.106253 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:47:54 crc kubenswrapper[4789]: E1216 07:47:54.107413 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:48:09 crc kubenswrapper[4789]: I1216 07:48:09.104982 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:48:09 crc kubenswrapper[4789]: E1216 07:48:09.105698 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:48:23 crc kubenswrapper[4789]: I1216 07:48:23.105434 4789 scope.go:117] "RemoveContainer" containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:48:24 crc kubenswrapper[4789]: I1216 07:48:24.208333 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" 
event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"48b20616493ffaf29236adc48e117e568c393563d1bffb585156662ae529052a"} Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.359303 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h7qv9"] Dec 16 07:49:17 crc kubenswrapper[4789]: E1216 07:49:17.360316 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b256111-1ac9-4f85-930e-4316e29c55fe" containerName="collect-profiles" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.360334 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b256111-1ac9-4f85-930e-4316e29c55fe" containerName="collect-profiles" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.360508 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b256111-1ac9-4f85-930e-4316e29c55fe" containerName="collect-profiles" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.361665 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.387470 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7qv9"] Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.465976 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-catalog-content\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.466049 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-utilities\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.466339 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdtj8\" (UniqueName: \"kubernetes.io/projected/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-kube-api-access-jdtj8\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.568329 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-catalog-content\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.568404 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-utilities\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.568467 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtj8\" (UniqueName: \"kubernetes.io/projected/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-kube-api-access-jdtj8\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.568866 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-catalog-content\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.568883 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-utilities\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.592151 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdtj8\" (UniqueName: \"kubernetes.io/projected/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-kube-api-access-jdtj8\") pod \"community-operators-h7qv9\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:17 crc kubenswrapper[4789]: I1216 07:49:17.686497 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:18 crc kubenswrapper[4789]: I1216 07:49:18.193892 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h7qv9"] Dec 16 07:49:18 crc kubenswrapper[4789]: I1216 07:49:18.628164 4789 generic.go:334] "Generic (PLEG): container finished" podID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerID="6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c" exitCode=0 Dec 16 07:49:18 crc kubenswrapper[4789]: I1216 07:49:18.628234 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7qv9" event={"ID":"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6","Type":"ContainerDied","Data":"6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c"} Dec 16 07:49:18 crc kubenswrapper[4789]: I1216 07:49:18.628507 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7qv9" event={"ID":"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6","Type":"ContainerStarted","Data":"0decf21abcc8083d8da7b9c4624bd969a0a9cee2e3bc7bdc36bce41e3e09cb62"} Dec 16 07:49:18 crc kubenswrapper[4789]: I1216 07:49:18.630668 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:49:20 crc kubenswrapper[4789]: I1216 07:49:20.644271 4789 generic.go:334] "Generic (PLEG): container finished" podID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerID="b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922" exitCode=0 Dec 16 07:49:20 crc kubenswrapper[4789]: I1216 07:49:20.644315 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7qv9" event={"ID":"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6","Type":"ContainerDied","Data":"b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922"} Dec 16 07:49:21 crc kubenswrapper[4789]: I1216 07:49:21.652206 4789 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-h7qv9" event={"ID":"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6","Type":"ContainerStarted","Data":"ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308"} Dec 16 07:49:21 crc kubenswrapper[4789]: I1216 07:49:21.672393 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h7qv9" podStartSLOduration=2.115875185 podStartE2EDuration="4.672371474s" podCreationTimestamp="2025-12-16 07:49:17 +0000 UTC" firstStartedPulling="2025-12-16 07:49:18.630457746 +0000 UTC m=+3496.892345375" lastFinishedPulling="2025-12-16 07:49:21.186954035 +0000 UTC m=+3499.448841664" observedRunningTime="2025-12-16 07:49:21.667387194 +0000 UTC m=+3499.929274843" watchObservedRunningTime="2025-12-16 07:49:21.672371474 +0000 UTC m=+3499.934259103" Dec 16 07:49:27 crc kubenswrapper[4789]: I1216 07:49:27.687104 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:27 crc kubenswrapper[4789]: I1216 07:49:27.687538 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:27 crc kubenswrapper[4789]: I1216 07:49:27.738956 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:28 crc kubenswrapper[4789]: I1216 07:49:28.755492 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:28 crc kubenswrapper[4789]: I1216 07:49:28.813775 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7qv9"] Dec 16 07:49:30 crc kubenswrapper[4789]: I1216 07:49:30.717903 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h7qv9" 
podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="registry-server" containerID="cri-o://ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308" gracePeriod=2 Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.160168 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.263938 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-utilities\") pod \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.263988 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-catalog-content\") pod \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.264054 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdtj8\" (UniqueName: \"kubernetes.io/projected/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-kube-api-access-jdtj8\") pod \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\" (UID: \"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6\") " Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.266325 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-utilities" (OuterVolumeSpecName: "utilities") pod "579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" (UID: "579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.276171 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-kube-api-access-jdtj8" (OuterVolumeSpecName: "kube-api-access-jdtj8") pod "579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" (UID: "579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6"). InnerVolumeSpecName "kube-api-access-jdtj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.321209 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" (UID: "579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.366728 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.366769 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.366784 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdtj8\" (UniqueName: \"kubernetes.io/projected/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6-kube-api-access-jdtj8\") on node \"crc\" DevicePath \"\"" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.726718 4789 generic.go:334] "Generic (PLEG): container finished" podID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" 
containerID="ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308" exitCode=0 Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.726783 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7qv9" event={"ID":"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6","Type":"ContainerDied","Data":"ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308"} Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.728093 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h7qv9" event={"ID":"579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6","Type":"ContainerDied","Data":"0decf21abcc8083d8da7b9c4624bd969a0a9cee2e3bc7bdc36bce41e3e09cb62"} Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.726792 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h7qv9" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.728128 4789 scope.go:117] "RemoveContainer" containerID="ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.747637 4789 scope.go:117] "RemoveContainer" containerID="b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.761052 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h7qv9"] Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.767892 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h7qv9"] Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.788531 4789 scope.go:117] "RemoveContainer" containerID="6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.808128 4789 scope.go:117] "RemoveContainer" containerID="ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308" Dec 16 
07:49:31 crc kubenswrapper[4789]: E1216 07:49:31.808596 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308\": container with ID starting with ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308 not found: ID does not exist" containerID="ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.808652 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308"} err="failed to get container status \"ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308\": rpc error: code = NotFound desc = could not find container \"ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308\": container with ID starting with ce5d6dd0c7544a4fd7e70e2658e2c9859c093cb87611bf4849f7e535f1ba6308 not found: ID does not exist" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.808687 4789 scope.go:117] "RemoveContainer" containerID="b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922" Dec 16 07:49:31 crc kubenswrapper[4789]: E1216 07:49:31.809102 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922\": container with ID starting with b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922 not found: ID does not exist" containerID="b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.809144 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922"} err="failed to get container status 
\"b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922\": rpc error: code = NotFound desc = could not find container \"b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922\": container with ID starting with b7cb3d30aacd076a175f656df7560272c4891f72966aab260420fc356c084922 not found: ID does not exist" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.809187 4789 scope.go:117] "RemoveContainer" containerID="6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c" Dec 16 07:49:31 crc kubenswrapper[4789]: E1216 07:49:31.809514 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c\": container with ID starting with 6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c not found: ID does not exist" containerID="6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c" Dec 16 07:49:31 crc kubenswrapper[4789]: I1216 07:49:31.809563 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c"} err="failed to get container status \"6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c\": rpc error: code = NotFound desc = could not find container \"6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c\": container with ID starting with 6e59e4b39faabd4d7797a02fef962a5cf93cd0b2e3941850b7b1fba3f2a7859c not found: ID does not exist" Dec 16 07:49:32 crc kubenswrapper[4789]: I1216 07:49:32.112464 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" path="/var/lib/kubelet/pods/579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6/volumes" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.456930 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wglg7"] Dec 16 07:49:46 
crc kubenswrapper[4789]: E1216 07:49:46.457761 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="extract-utilities" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.457776 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="extract-utilities" Dec 16 07:49:46 crc kubenswrapper[4789]: E1216 07:49:46.457794 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="extract-content" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.457802 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="extract-content" Dec 16 07:49:46 crc kubenswrapper[4789]: E1216 07:49:46.457818 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="registry-server" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.457829 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="registry-server" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.458061 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="579ecae5-1bc9-47ea-8c5a-dbe5cbdf0ac6" containerName="registry-server" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.459242 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.467189 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wglg7"] Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.481056 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-utilities\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.481093 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-catalog-content\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.481136 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9nrp\" (UniqueName: \"kubernetes.io/projected/af5bacaa-192a-46b5-8041-bc81999f0572-kube-api-access-h9nrp\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.581889 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9nrp\" (UniqueName: \"kubernetes.io/projected/af5bacaa-192a-46b5-8041-bc81999f0572-kube-api-access-h9nrp\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.581997 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-utilities\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.582022 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-catalog-content\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.582487 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-catalog-content\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.582603 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-utilities\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.600326 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9nrp\" (UniqueName: \"kubernetes.io/projected/af5bacaa-192a-46b5-8041-bc81999f0572-kube-api-access-h9nrp\") pod \"redhat-marketplace-wglg7\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:46 crc kubenswrapper[4789]: I1216 07:49:46.779218 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:47 crc kubenswrapper[4789]: I1216 07:49:47.205140 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wglg7"] Dec 16 07:49:47 crc kubenswrapper[4789]: I1216 07:49:47.839993 4789 generic.go:334] "Generic (PLEG): container finished" podID="af5bacaa-192a-46b5-8041-bc81999f0572" containerID="c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a" exitCode=0 Dec 16 07:49:47 crc kubenswrapper[4789]: I1216 07:49:47.840087 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wglg7" event={"ID":"af5bacaa-192a-46b5-8041-bc81999f0572","Type":"ContainerDied","Data":"c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a"} Dec 16 07:49:47 crc kubenswrapper[4789]: I1216 07:49:47.840348 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wglg7" event={"ID":"af5bacaa-192a-46b5-8041-bc81999f0572","Type":"ContainerStarted","Data":"a618c4f7628ccfea759856f043b16dfecd30d2851085a4bc49c7fd828687fcd6"} Dec 16 07:49:50 crc kubenswrapper[4789]: I1216 07:49:50.861528 4789 generic.go:334] "Generic (PLEG): container finished" podID="af5bacaa-192a-46b5-8041-bc81999f0572" containerID="e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729" exitCode=0 Dec 16 07:49:50 crc kubenswrapper[4789]: I1216 07:49:50.861599 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wglg7" event={"ID":"af5bacaa-192a-46b5-8041-bc81999f0572","Type":"ContainerDied","Data":"e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729"} Dec 16 07:49:51 crc kubenswrapper[4789]: I1216 07:49:51.876168 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wglg7" 
event={"ID":"af5bacaa-192a-46b5-8041-bc81999f0572","Type":"ContainerStarted","Data":"ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9"} Dec 16 07:49:51 crc kubenswrapper[4789]: I1216 07:49:51.895689 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wglg7" podStartSLOduration=2.390663187 podStartE2EDuration="5.895671375s" podCreationTimestamp="2025-12-16 07:49:46 +0000 UTC" firstStartedPulling="2025-12-16 07:49:47.845045456 +0000 UTC m=+3526.106933085" lastFinishedPulling="2025-12-16 07:49:51.350053644 +0000 UTC m=+3529.611941273" observedRunningTime="2025-12-16 07:49:51.895631994 +0000 UTC m=+3530.157519633" watchObservedRunningTime="2025-12-16 07:49:51.895671375 +0000 UTC m=+3530.157559004" Dec 16 07:49:56 crc kubenswrapper[4789]: I1216 07:49:56.779764 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:56 crc kubenswrapper[4789]: I1216 07:49:56.780076 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:56 crc kubenswrapper[4789]: I1216 07:49:56.819676 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:56 crc kubenswrapper[4789]: I1216 07:49:56.955182 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:57 crc kubenswrapper[4789]: I1216 07:49:57.056538 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wglg7"] Dec 16 07:49:58 crc kubenswrapper[4789]: I1216 07:49:58.925494 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wglg7" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="registry-server" 
containerID="cri-o://ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9" gracePeriod=2 Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.847173 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.932162 4789 generic.go:334] "Generic (PLEG): container finished" podID="af5bacaa-192a-46b5-8041-bc81999f0572" containerID="ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9" exitCode=0 Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.932223 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wglg7" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.932250 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wglg7" event={"ID":"af5bacaa-192a-46b5-8041-bc81999f0572","Type":"ContainerDied","Data":"ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9"} Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.932292 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wglg7" event={"ID":"af5bacaa-192a-46b5-8041-bc81999f0572","Type":"ContainerDied","Data":"a618c4f7628ccfea759856f043b16dfecd30d2851085a4bc49c7fd828687fcd6"} Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.932334 4789 scope.go:117] "RemoveContainer" containerID="ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.950098 4789 scope.go:117] "RemoveContainer" containerID="e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.964087 4789 scope.go:117] "RemoveContainer" containerID="c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.972566 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-catalog-content\") pod \"af5bacaa-192a-46b5-8041-bc81999f0572\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.972598 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-utilities\") pod \"af5bacaa-192a-46b5-8041-bc81999f0572\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.972679 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9nrp\" (UniqueName: \"kubernetes.io/projected/af5bacaa-192a-46b5-8041-bc81999f0572-kube-api-access-h9nrp\") pod \"af5bacaa-192a-46b5-8041-bc81999f0572\" (UID: \"af5bacaa-192a-46b5-8041-bc81999f0572\") " Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.973345 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-utilities" (OuterVolumeSpecName: "utilities") pod "af5bacaa-192a-46b5-8041-bc81999f0572" (UID: "af5bacaa-192a-46b5-8041-bc81999f0572"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.979768 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5bacaa-192a-46b5-8041-bc81999f0572-kube-api-access-h9nrp" (OuterVolumeSpecName: "kube-api-access-h9nrp") pod "af5bacaa-192a-46b5-8041-bc81999f0572" (UID: "af5bacaa-192a-46b5-8041-bc81999f0572"). InnerVolumeSpecName "kube-api-access-h9nrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.986992 4789 scope.go:117] "RemoveContainer" containerID="ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9" Dec 16 07:49:59 crc kubenswrapper[4789]: E1216 07:49:59.987318 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9\": container with ID starting with ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9 not found: ID does not exist" containerID="ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.987352 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9"} err="failed to get container status \"ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9\": rpc error: code = NotFound desc = could not find container \"ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9\": container with ID starting with ebbdeb063edb8cff2496af43fca46c86ffa671d1805a3bb63b11890a904ad0d9 not found: ID does not exist" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.987372 4789 scope.go:117] "RemoveContainer" containerID="e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729" Dec 16 07:49:59 crc kubenswrapper[4789]: E1216 07:49:59.987650 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729\": container with ID starting with e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729 not found: ID does not exist" containerID="e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.987694 
4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729"} err="failed to get container status \"e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729\": rpc error: code = NotFound desc = could not find container \"e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729\": container with ID starting with e96dce8eeb2f809dde8d484b5b66563acb89a67aea956739acdba53a0b8e9729 not found: ID does not exist" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.987727 4789 scope.go:117] "RemoveContainer" containerID="c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a" Dec 16 07:49:59 crc kubenswrapper[4789]: E1216 07:49:59.988004 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a\": container with ID starting with c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a not found: ID does not exist" containerID="c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.988026 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a"} err="failed to get container status \"c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a\": rpc error: code = NotFound desc = could not find container \"c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a\": container with ID starting with c31d91c7f168ed063f024da974be7f4104959a7385da4f3ca20853e97345776a not found: ID does not exist" Dec 16 07:49:59 crc kubenswrapper[4789]: I1216 07:49:59.997434 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "af5bacaa-192a-46b5-8041-bc81999f0572" (UID: "af5bacaa-192a-46b5-8041-bc81999f0572"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:50:00 crc kubenswrapper[4789]: I1216 07:50:00.074358 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:50:00 crc kubenswrapper[4789]: I1216 07:50:00.074400 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af5bacaa-192a-46b5-8041-bc81999f0572-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:50:00 crc kubenswrapper[4789]: I1216 07:50:00.074410 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9nrp\" (UniqueName: \"kubernetes.io/projected/af5bacaa-192a-46b5-8041-bc81999f0572-kube-api-access-h9nrp\") on node \"crc\" DevicePath \"\"" Dec 16 07:50:00 crc kubenswrapper[4789]: I1216 07:50:00.254894 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wglg7"] Dec 16 07:50:00 crc kubenswrapper[4789]: I1216 07:50:00.259402 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wglg7"] Dec 16 07:50:02 crc kubenswrapper[4789]: I1216 07:50:02.113965 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" path="/var/lib/kubelet/pods/af5bacaa-192a-46b5-8041-bc81999f0572/volumes" Dec 16 07:50:51 crc kubenswrapper[4789]: I1216 07:50:51.928247 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:50:51 
crc kubenswrapper[4789]: I1216 07:50:51.929174 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:51:21 crc kubenswrapper[4789]: I1216 07:51:21.928522 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:51:21 crc kubenswrapper[4789]: I1216 07:51:21.929148 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:51:51 crc kubenswrapper[4789]: I1216 07:51:51.927852 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:51:51 crc kubenswrapper[4789]: I1216 07:51:51.928521 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:51:51 crc kubenswrapper[4789]: I1216 07:51:51.928572 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:51:51 crc kubenswrapper[4789]: I1216 07:51:51.929251 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48b20616493ffaf29236adc48e117e568c393563d1bffb585156662ae529052a"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:51:51 crc kubenswrapper[4789]: I1216 07:51:51.929322 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://48b20616493ffaf29236adc48e117e568c393563d1bffb585156662ae529052a" gracePeriod=600 Dec 16 07:51:52 crc kubenswrapper[4789]: I1216 07:51:52.869173 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="48b20616493ffaf29236adc48e117e568c393563d1bffb585156662ae529052a" exitCode=0 Dec 16 07:51:52 crc kubenswrapper[4789]: I1216 07:51:52.869275 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"48b20616493ffaf29236adc48e117e568c393563d1bffb585156662ae529052a"} Dec 16 07:51:52 crc kubenswrapper[4789]: I1216 07:51:52.869780 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a"} Dec 16 07:51:52 crc kubenswrapper[4789]: I1216 07:51:52.869801 4789 scope.go:117] "RemoveContainer" 
containerID="7fda4fcec57a8af69bbe899d0c89b8d36f14246eef88e3e3c6ece189a938c7b5" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.371184 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lhkrd"] Dec 16 07:53:01 crc kubenswrapper[4789]: E1216 07:53:01.372420 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="extract-utilities" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.372449 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="extract-utilities" Dec 16 07:53:01 crc kubenswrapper[4789]: E1216 07:53:01.372481 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="extract-content" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.372528 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="extract-content" Dec 16 07:53:01 crc kubenswrapper[4789]: E1216 07:53:01.372591 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.372610 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.372965 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5bacaa-192a-46b5-8041-bc81999f0572" containerName="registry-server" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.375015 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.384942 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhkrd"] Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.520710 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssnpm\" (UniqueName: \"kubernetes.io/projected/030a76ed-9e8d-4e9b-9562-487d9cf615b4-kube-api-access-ssnpm\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.520789 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-catalog-content\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.520931 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-utilities\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.621767 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-catalog-content\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.621842 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-utilities\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.621897 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssnpm\" (UniqueName: \"kubernetes.io/projected/030a76ed-9e8d-4e9b-9562-487d9cf615b4-kube-api-access-ssnpm\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.622284 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-catalog-content\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.622761 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-utilities\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.647064 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssnpm\" (UniqueName: \"kubernetes.io/projected/030a76ed-9e8d-4e9b-9562-487d9cf615b4-kube-api-access-ssnpm\") pod \"certified-operators-lhkrd\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:01 crc kubenswrapper[4789]: I1216 07:53:01.705360 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:02 crc kubenswrapper[4789]: I1216 07:53:02.008189 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhkrd"] Dec 16 07:53:02 crc kubenswrapper[4789]: I1216 07:53:02.446133 4789 generic.go:334] "Generic (PLEG): container finished" podID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerID="811bb49d4737a47457f58a38deae21dbf627e49c9fb220419a1fbc77efa2c6a1" exitCode=0 Dec 16 07:53:02 crc kubenswrapper[4789]: I1216 07:53:02.446174 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhkrd" event={"ID":"030a76ed-9e8d-4e9b-9562-487d9cf615b4","Type":"ContainerDied","Data":"811bb49d4737a47457f58a38deae21dbf627e49c9fb220419a1fbc77efa2c6a1"} Dec 16 07:53:02 crc kubenswrapper[4789]: I1216 07:53:02.446473 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhkrd" event={"ID":"030a76ed-9e8d-4e9b-9562-487d9cf615b4","Type":"ContainerStarted","Data":"299d4ebd2c24f6eae2c215b6fb7a2bdd1e51d259339e338b5da3db9cbead02df"} Dec 16 07:53:03 crc kubenswrapper[4789]: I1216 07:53:03.454983 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhkrd" event={"ID":"030a76ed-9e8d-4e9b-9562-487d9cf615b4","Type":"ContainerStarted","Data":"12ff6b6bd090b974c451426861d7014e863b99924b642d56c3836fd301c73658"} Dec 16 07:53:04 crc kubenswrapper[4789]: I1216 07:53:04.465735 4789 generic.go:334] "Generic (PLEG): container finished" podID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerID="12ff6b6bd090b974c451426861d7014e863b99924b642d56c3836fd301c73658" exitCode=0 Dec 16 07:53:04 crc kubenswrapper[4789]: I1216 07:53:04.465818 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhkrd" 
event={"ID":"030a76ed-9e8d-4e9b-9562-487d9cf615b4","Type":"ContainerDied","Data":"12ff6b6bd090b974c451426861d7014e863b99924b642d56c3836fd301c73658"} Dec 16 07:53:05 crc kubenswrapper[4789]: I1216 07:53:05.474254 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhkrd" event={"ID":"030a76ed-9e8d-4e9b-9562-487d9cf615b4","Type":"ContainerStarted","Data":"82e1189e3a50170a1306ad67660fa50328d238000c83eeb516d59533a1a93622"} Dec 16 07:53:05 crc kubenswrapper[4789]: I1216 07:53:05.503238 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lhkrd" podStartSLOduration=1.818837415 podStartE2EDuration="4.50321657s" podCreationTimestamp="2025-12-16 07:53:01 +0000 UTC" firstStartedPulling="2025-12-16 07:53:02.447747565 +0000 UTC m=+3720.709635194" lastFinishedPulling="2025-12-16 07:53:05.1321267 +0000 UTC m=+3723.394014349" observedRunningTime="2025-12-16 07:53:05.49615817 +0000 UTC m=+3723.758045809" watchObservedRunningTime="2025-12-16 07:53:05.50321657 +0000 UTC m=+3723.765104199" Dec 16 07:53:11 crc kubenswrapper[4789]: I1216 07:53:11.706002 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:11 crc kubenswrapper[4789]: I1216 07:53:11.706809 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:11 crc kubenswrapper[4789]: I1216 07:53:11.779058 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:12 crc kubenswrapper[4789]: I1216 07:53:12.618749 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:12 crc kubenswrapper[4789]: I1216 07:53:12.671229 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-lhkrd"] Dec 16 07:53:14 crc kubenswrapper[4789]: I1216 07:53:14.569587 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lhkrd" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="registry-server" containerID="cri-o://82e1189e3a50170a1306ad67660fa50328d238000c83eeb516d59533a1a93622" gracePeriod=2 Dec 16 07:53:15 crc kubenswrapper[4789]: I1216 07:53:15.579781 4789 generic.go:334] "Generic (PLEG): container finished" podID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerID="82e1189e3a50170a1306ad67660fa50328d238000c83eeb516d59533a1a93622" exitCode=0 Dec 16 07:53:15 crc kubenswrapper[4789]: I1216 07:53:15.579825 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhkrd" event={"ID":"030a76ed-9e8d-4e9b-9562-487d9cf615b4","Type":"ContainerDied","Data":"82e1189e3a50170a1306ad67660fa50328d238000c83eeb516d59533a1a93622"} Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.121677 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.251841 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-catalog-content\") pod \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.251887 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssnpm\" (UniqueName: \"kubernetes.io/projected/030a76ed-9e8d-4e9b-9562-487d9cf615b4-kube-api-access-ssnpm\") pod \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.251983 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-utilities\") pod \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\" (UID: \"030a76ed-9e8d-4e9b-9562-487d9cf615b4\") " Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.254305 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-utilities" (OuterVolumeSpecName: "utilities") pod "030a76ed-9e8d-4e9b-9562-487d9cf615b4" (UID: "030a76ed-9e8d-4e9b-9562-487d9cf615b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.258323 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030a76ed-9e8d-4e9b-9562-487d9cf615b4-kube-api-access-ssnpm" (OuterVolumeSpecName: "kube-api-access-ssnpm") pod "030a76ed-9e8d-4e9b-9562-487d9cf615b4" (UID: "030a76ed-9e8d-4e9b-9562-487d9cf615b4"). InnerVolumeSpecName "kube-api-access-ssnpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.320638 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "030a76ed-9e8d-4e9b-9562-487d9cf615b4" (UID: "030a76ed-9e8d-4e9b-9562-487d9cf615b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.353405 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.353430 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssnpm\" (UniqueName: \"kubernetes.io/projected/030a76ed-9e8d-4e9b-9562-487d9cf615b4-kube-api-access-ssnpm\") on node \"crc\" DevicePath \"\"" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.353439 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/030a76ed-9e8d-4e9b-9562-487d9cf615b4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.593195 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhkrd" event={"ID":"030a76ed-9e8d-4e9b-9562-487d9cf615b4","Type":"ContainerDied","Data":"299d4ebd2c24f6eae2c215b6fb7a2bdd1e51d259339e338b5da3db9cbead02df"} Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.593249 4789 scope.go:117] "RemoveContainer" containerID="82e1189e3a50170a1306ad67660fa50328d238000c83eeb516d59533a1a93622" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.593395 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhkrd" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.654343 4789 scope.go:117] "RemoveContainer" containerID="12ff6b6bd090b974c451426861d7014e863b99924b642d56c3836fd301c73658" Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.668420 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhkrd"] Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.684391 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lhkrd"] Dec 16 07:53:16 crc kubenswrapper[4789]: I1216 07:53:16.686272 4789 scope.go:117] "RemoveContainer" containerID="811bb49d4737a47457f58a38deae21dbf627e49c9fb220419a1fbc77efa2c6a1" Dec 16 07:53:18 crc kubenswrapper[4789]: I1216 07:53:18.117354 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" path="/var/lib/kubelet/pods/030a76ed-9e8d-4e9b-9562-487d9cf615b4/volumes" Dec 16 07:54:21 crc kubenswrapper[4789]: I1216 07:54:21.927674 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:54:21 crc kubenswrapper[4789]: I1216 07:54:21.929023 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:54:51 crc kubenswrapper[4789]: I1216 07:54:51.927837 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:54:51 crc kubenswrapper[4789]: I1216 07:54:51.929091 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:55:21 crc kubenswrapper[4789]: I1216 07:55:21.927742 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 07:55:21 crc kubenswrapper[4789]: I1216 07:55:21.928449 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 07:55:21 crc kubenswrapper[4789]: I1216 07:55:21.928510 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 07:55:21 crc kubenswrapper[4789]: I1216 07:55:21.929316 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 07:55:21 crc kubenswrapper[4789]: I1216 07:55:21.929408 4789 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" gracePeriod=600 Dec 16 07:55:22 crc kubenswrapper[4789]: E1216 07:55:22.060780 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:55:22 crc kubenswrapper[4789]: I1216 07:55:22.496753 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" exitCode=0 Dec 16 07:55:22 crc kubenswrapper[4789]: I1216 07:55:22.496800 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a"} Dec 16 07:55:22 crc kubenswrapper[4789]: I1216 07:55:22.496847 4789 scope.go:117] "RemoveContainer" containerID="48b20616493ffaf29236adc48e117e568c393563d1bffb585156662ae529052a" Dec 16 07:55:22 crc kubenswrapper[4789]: I1216 07:55:22.498164 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:55:22 crc kubenswrapper[4789]: E1216 07:55:22.498481 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:55:36 crc kubenswrapper[4789]: I1216 07:55:36.105008 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:55:36 crc kubenswrapper[4789]: E1216 07:55:36.105755 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:55:48 crc kubenswrapper[4789]: I1216 07:55:48.105608 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:55:48 crc kubenswrapper[4789]: E1216 07:55:48.106352 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:56:03 crc kubenswrapper[4789]: I1216 07:56:03.105295 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:56:03 crc kubenswrapper[4789]: E1216 07:56:03.106048 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:56:16 crc kubenswrapper[4789]: I1216 07:56:16.106016 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:56:16 crc kubenswrapper[4789]: E1216 07:56:16.107158 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:56:31 crc kubenswrapper[4789]: I1216 07:56:31.105330 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:56:31 crc kubenswrapper[4789]: E1216 07:56:31.106240 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:56:43 crc kubenswrapper[4789]: I1216 07:56:43.105524 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:56:43 crc kubenswrapper[4789]: E1216 07:56:43.106519 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:56:54 crc kubenswrapper[4789]: I1216 07:56:54.105069 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:56:54 crc kubenswrapper[4789]: E1216 07:56:54.105891 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:57:07 crc kubenswrapper[4789]: I1216 07:57:07.104190 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:57:07 crc kubenswrapper[4789]: E1216 07:57:07.104827 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:57:19 crc kubenswrapper[4789]: I1216 07:57:19.105400 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:57:19 crc kubenswrapper[4789]: E1216 07:57:19.106408 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:57:33 crc kubenswrapper[4789]: I1216 07:57:33.105279 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:57:33 crc kubenswrapper[4789]: E1216 07:57:33.107307 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:57:45 crc kubenswrapper[4789]: I1216 07:57:45.105065 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:57:45 crc kubenswrapper[4789]: E1216 07:57:45.106115 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:57:59 crc kubenswrapper[4789]: I1216 07:57:59.104900 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:57:59 crc kubenswrapper[4789]: E1216 07:57:59.105493 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:58:13 crc kubenswrapper[4789]: I1216 07:58:13.105021 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:58:13 crc kubenswrapper[4789]: E1216 07:58:13.106017 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:58:26 crc kubenswrapper[4789]: I1216 07:58:26.104997 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:58:26 crc kubenswrapper[4789]: E1216 07:58:26.105940 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:58:37 crc kubenswrapper[4789]: I1216 07:58:37.105481 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:58:37 crc kubenswrapper[4789]: E1216 07:58:37.106723 4789 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:58:49 crc kubenswrapper[4789]: I1216 07:58:49.105254 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:58:49 crc kubenswrapper[4789]: E1216 07:58:49.106122 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:59:03 crc kubenswrapper[4789]: I1216 07:59:03.104840 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:59:03 crc kubenswrapper[4789]: E1216 07:59:03.105636 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:59:14 crc kubenswrapper[4789]: I1216 07:59:14.105745 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:59:14 crc kubenswrapper[4789]: E1216 07:59:14.106899 4789 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.536817 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wbhfk"] Dec 16 07:59:15 crc kubenswrapper[4789]: E1216 07:59:15.537467 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="registry-server" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.537484 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="registry-server" Dec 16 07:59:15 crc kubenswrapper[4789]: E1216 07:59:15.537508 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="extract-content" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.537518 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="extract-content" Dec 16 07:59:15 crc kubenswrapper[4789]: E1216 07:59:15.537538 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="extract-utilities" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.537547 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="extract-utilities" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.537739 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="030a76ed-9e8d-4e9b-9562-487d9cf615b4" containerName="registry-server" Dec 16 07:59:15 crc 
kubenswrapper[4789]: I1216 07:59:15.538942 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.555437 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbhfk"] Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.650662 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-catalog-content\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.650768 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/25f27e50-e318-4cdc-bc37-da51b2ca8e21-kube-api-access-876tw\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.650818 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-utilities\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.751905 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-utilities\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 
07:59:15.752032 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-catalog-content\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.752100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/25f27e50-e318-4cdc-bc37-da51b2ca8e21-kube-api-access-876tw\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.752347 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-utilities\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.752711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-catalog-content\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.771480 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/25f27e50-e318-4cdc-bc37-da51b2ca8e21-kube-api-access-876tw\") pod \"redhat-operators-wbhfk\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:15 crc kubenswrapper[4789]: I1216 07:59:15.856540 4789 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:16 crc kubenswrapper[4789]: I1216 07:59:16.289072 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbhfk"] Dec 16 07:59:17 crc kubenswrapper[4789]: I1216 07:59:17.266449 4789 generic.go:334] "Generic (PLEG): container finished" podID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerID="ee4728bb217d1897ad573bee1a2480cbfe74f0f36398f81feea836450a82cecd" exitCode=0 Dec 16 07:59:17 crc kubenswrapper[4789]: I1216 07:59:17.266521 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbhfk" event={"ID":"25f27e50-e318-4cdc-bc37-da51b2ca8e21","Type":"ContainerDied","Data":"ee4728bb217d1897ad573bee1a2480cbfe74f0f36398f81feea836450a82cecd"} Dec 16 07:59:17 crc kubenswrapper[4789]: I1216 07:59:17.267178 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbhfk" event={"ID":"25f27e50-e318-4cdc-bc37-da51b2ca8e21","Type":"ContainerStarted","Data":"1bb966fcd667623cec8f965bf844850a3a16535c0954433d6f231fa22508b566"} Dec 16 07:59:17 crc kubenswrapper[4789]: I1216 07:59:17.268180 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 07:59:18 crc kubenswrapper[4789]: I1216 07:59:18.272897 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbhfk" event={"ID":"25f27e50-e318-4cdc-bc37-da51b2ca8e21","Type":"ContainerStarted","Data":"01e67eada345a35ec20b866ba73540d848224262aa04b30a4aadd139894e48c3"} Dec 16 07:59:19 crc kubenswrapper[4789]: I1216 07:59:19.279866 4789 generic.go:334] "Generic (PLEG): container finished" podID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerID="01e67eada345a35ec20b866ba73540d848224262aa04b30a4aadd139894e48c3" exitCode=0 Dec 16 07:59:19 crc kubenswrapper[4789]: I1216 07:59:19.279949 4789 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-wbhfk" event={"ID":"25f27e50-e318-4cdc-bc37-da51b2ca8e21","Type":"ContainerDied","Data":"01e67eada345a35ec20b866ba73540d848224262aa04b30a4aadd139894e48c3"} Dec 16 07:59:20 crc kubenswrapper[4789]: I1216 07:59:20.290861 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbhfk" event={"ID":"25f27e50-e318-4cdc-bc37-da51b2ca8e21","Type":"ContainerStarted","Data":"7cc38da8cb23956c22077bda01e8ac6f0d0a2168260413ca4b586e8e30fb1f75"} Dec 16 07:59:20 crc kubenswrapper[4789]: I1216 07:59:20.313583 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wbhfk" podStartSLOduration=2.610477694 podStartE2EDuration="5.313566412s" podCreationTimestamp="2025-12-16 07:59:15 +0000 UTC" firstStartedPulling="2025-12-16 07:59:17.267972646 +0000 UTC m=+4095.529860265" lastFinishedPulling="2025-12-16 07:59:19.971061354 +0000 UTC m=+4098.232948983" observedRunningTime="2025-12-16 07:59:20.311118373 +0000 UTC m=+4098.573006012" watchObservedRunningTime="2025-12-16 07:59:20.313566412 +0000 UTC m=+4098.575454041" Dec 16 07:59:25 crc kubenswrapper[4789]: I1216 07:59:25.857070 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:25 crc kubenswrapper[4789]: I1216 07:59:25.857469 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:25 crc kubenswrapper[4789]: I1216 07:59:25.898692 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:26 crc kubenswrapper[4789]: I1216 07:59:26.105285 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:59:26 crc kubenswrapper[4789]: E1216 07:59:26.105516 4789 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:59:26 crc kubenswrapper[4789]: I1216 07:59:26.369031 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:26 crc kubenswrapper[4789]: I1216 07:59:26.409565 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbhfk"] Dec 16 07:59:28 crc kubenswrapper[4789]: I1216 07:59:28.343553 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wbhfk" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="registry-server" containerID="cri-o://7cc38da8cb23956c22077bda01e8ac6f0d0a2168260413ca4b586e8e30fb1f75" gracePeriod=2 Dec 16 07:59:30 crc kubenswrapper[4789]: I1216 07:59:30.356713 4789 generic.go:334] "Generic (PLEG): container finished" podID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerID="7cc38da8cb23956c22077bda01e8ac6f0d0a2168260413ca4b586e8e30fb1f75" exitCode=0 Dec 16 07:59:30 crc kubenswrapper[4789]: I1216 07:59:30.356773 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbhfk" event={"ID":"25f27e50-e318-4cdc-bc37-da51b2ca8e21","Type":"ContainerDied","Data":"7cc38da8cb23956c22077bda01e8ac6f0d0a2168260413ca4b586e8e30fb1f75"} Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.438766 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.485723 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/25f27e50-e318-4cdc-bc37-da51b2ca8e21-kube-api-access-876tw\") pod \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.485798 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-utilities\") pod \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.485947 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-catalog-content\") pod \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\" (UID: \"25f27e50-e318-4cdc-bc37-da51b2ca8e21\") " Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.486985 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-utilities" (OuterVolumeSpecName: "utilities") pod "25f27e50-e318-4cdc-bc37-da51b2ca8e21" (UID: "25f27e50-e318-4cdc-bc37-da51b2ca8e21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.491171 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f27e50-e318-4cdc-bc37-da51b2ca8e21-kube-api-access-876tw" (OuterVolumeSpecName: "kube-api-access-876tw") pod "25f27e50-e318-4cdc-bc37-da51b2ca8e21" (UID: "25f27e50-e318-4cdc-bc37-da51b2ca8e21"). InnerVolumeSpecName "kube-api-access-876tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.587686 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-876tw\" (UniqueName: \"kubernetes.io/projected/25f27e50-e318-4cdc-bc37-da51b2ca8e21-kube-api-access-876tw\") on node \"crc\" DevicePath \"\"" Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.587742 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.603876 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25f27e50-e318-4cdc-bc37-da51b2ca8e21" (UID: "25f27e50-e318-4cdc-bc37-da51b2ca8e21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 07:59:31 crc kubenswrapper[4789]: I1216 07:59:31.688930 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f27e50-e318-4cdc-bc37-da51b2ca8e21-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 07:59:32 crc kubenswrapper[4789]: I1216 07:59:32.373382 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbhfk" event={"ID":"25f27e50-e318-4cdc-bc37-da51b2ca8e21","Type":"ContainerDied","Data":"1bb966fcd667623cec8f965bf844850a3a16535c0954433d6f231fa22508b566"} Dec 16 07:59:32 crc kubenswrapper[4789]: I1216 07:59:32.373429 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wbhfk" Dec 16 07:59:32 crc kubenswrapper[4789]: I1216 07:59:32.373728 4789 scope.go:117] "RemoveContainer" containerID="7cc38da8cb23956c22077bda01e8ac6f0d0a2168260413ca4b586e8e30fb1f75" Dec 16 07:59:32 crc kubenswrapper[4789]: I1216 07:59:32.409206 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbhfk"] Dec 16 07:59:32 crc kubenswrapper[4789]: I1216 07:59:32.415036 4789 scope.go:117] "RemoveContainer" containerID="01e67eada345a35ec20b866ba73540d848224262aa04b30a4aadd139894e48c3" Dec 16 07:59:32 crc kubenswrapper[4789]: I1216 07:59:32.421352 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wbhfk"] Dec 16 07:59:32 crc kubenswrapper[4789]: I1216 07:59:32.432412 4789 scope.go:117] "RemoveContainer" containerID="ee4728bb217d1897ad573bee1a2480cbfe74f0f36398f81feea836450a82cecd" Dec 16 07:59:34 crc kubenswrapper[4789]: I1216 07:59:34.113172 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" path="/var/lib/kubelet/pods/25f27e50-e318-4cdc-bc37-da51b2ca8e21/volumes" Dec 16 07:59:37 crc kubenswrapper[4789]: I1216 07:59:37.106166 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:59:37 crc kubenswrapper[4789]: E1216 07:59:37.107215 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:59:52 crc kubenswrapper[4789]: I1216 07:59:52.115485 4789 scope.go:117] "RemoveContainer" 
containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 07:59:52 crc kubenswrapper[4789]: E1216 07:59:52.117305 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.842108 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bhhpn"] Dec 16 07:59:59 crc kubenswrapper[4789]: E1216 07:59:59.842663 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="registry-server" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.842675 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="registry-server" Dec 16 07:59:59 crc kubenswrapper[4789]: E1216 07:59:59.842696 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="extract-utilities" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.842703 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="extract-utilities" Dec 16 07:59:59 crc kubenswrapper[4789]: E1216 07:59:59.842713 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="extract-content" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.842718 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="extract-content" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.842861 
4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f27e50-e318-4cdc-bc37-da51b2ca8e21" containerName="registry-server" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.843837 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.849332 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhhpn"] Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.871249 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-catalog-content\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.871310 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-utilities\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.871414 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpz8g\" (UniqueName: \"kubernetes.io/projected/b747f06b-9d72-45bf-8d14-bbb4b5b68583-kube-api-access-kpz8g\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.972570 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpz8g\" (UniqueName: 
\"kubernetes.io/projected/b747f06b-9d72-45bf-8d14-bbb4b5b68583-kube-api-access-kpz8g\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.972645 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-catalog-content\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.972713 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-utilities\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.973095 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-catalog-content\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.973215 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-utilities\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 07:59:59 crc kubenswrapper[4789]: I1216 07:59:59.991368 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpz8g\" (UniqueName: 
\"kubernetes.io/projected/b747f06b-9d72-45bf-8d14-bbb4b5b68583-kube-api-access-kpz8g\") pod \"redhat-marketplace-bhhpn\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.167126 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.180191 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl"] Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.181248 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.188572 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.188980 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.195493 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl"] Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.378630 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a191b34-d6cb-4afc-accf-4ec4ba9734af-config-volume\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.378993 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-snth7\" (UniqueName: \"kubernetes.io/projected/2a191b34-d6cb-4afc-accf-4ec4ba9734af-kube-api-access-snth7\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.379018 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a191b34-d6cb-4afc-accf-4ec4ba9734af-secret-volume\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.480548 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a191b34-d6cb-4afc-accf-4ec4ba9734af-config-volume\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.480616 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snth7\" (UniqueName: \"kubernetes.io/projected/2a191b34-d6cb-4afc-accf-4ec4ba9734af-kube-api-access-snth7\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.480646 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a191b34-d6cb-4afc-accf-4ec4ba9734af-secret-volume\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.481420 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a191b34-d6cb-4afc-accf-4ec4ba9734af-config-volume\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.493145 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a191b34-d6cb-4afc-accf-4ec4ba9734af-secret-volume\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.496244 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snth7\" (UniqueName: \"kubernetes.io/projected/2a191b34-d6cb-4afc-accf-4ec4ba9734af-kube-api-access-snth7\") pod \"collect-profiles-29431200-rk6nl\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.562757 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.609409 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhhpn"] Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.767846 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhhpn" event={"ID":"b747f06b-9d72-45bf-8d14-bbb4b5b68583","Type":"ContainerStarted","Data":"36ddd3138f8b383b87cb25d706289e4aa55c9114134a3a570c02b95ff72d3cef"} Dec 16 08:00:00 crc kubenswrapper[4789]: I1216 08:00:00.966419 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl"] Dec 16 08:00:00 crc kubenswrapper[4789]: W1216 08:00:00.971891 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a191b34_d6cb_4afc_accf_4ec4ba9734af.slice/crio-79e186b80db1fa28432db3020007bcbdfb9c804c723a912af5b7e54b9dcd7613 WatchSource:0}: Error finding container 79e186b80db1fa28432db3020007bcbdfb9c804c723a912af5b7e54b9dcd7613: Status 404 returned error can't find the container with id 79e186b80db1fa28432db3020007bcbdfb9c804c723a912af5b7e54b9dcd7613 Dec 16 08:00:01 crc kubenswrapper[4789]: I1216 08:00:01.776612 4789 generic.go:334] "Generic (PLEG): container finished" podID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerID="f910f6ae04a9cb2f642989f26b44687a7cbf55b85d0eaff7c31547daac5e782c" exitCode=0 Dec 16 08:00:01 crc kubenswrapper[4789]: I1216 08:00:01.776697 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhhpn" event={"ID":"b747f06b-9d72-45bf-8d14-bbb4b5b68583","Type":"ContainerDied","Data":"f910f6ae04a9cb2f642989f26b44687a7cbf55b85d0eaff7c31547daac5e782c"} Dec 16 08:00:01 crc kubenswrapper[4789]: I1216 08:00:01.778331 4789 generic.go:334] "Generic 
(PLEG): container finished" podID="2a191b34-d6cb-4afc-accf-4ec4ba9734af" containerID="a7978195adc3ea814d11f59cee1995ab921d9071e34f799870d687f0e300a335" exitCode=0 Dec 16 08:00:01 crc kubenswrapper[4789]: I1216 08:00:01.778356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" event={"ID":"2a191b34-d6cb-4afc-accf-4ec4ba9734af","Type":"ContainerDied","Data":"a7978195adc3ea814d11f59cee1995ab921d9071e34f799870d687f0e300a335"} Dec 16 08:00:01 crc kubenswrapper[4789]: I1216 08:00:01.778378 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" event={"ID":"2a191b34-d6cb-4afc-accf-4ec4ba9734af","Type":"ContainerStarted","Data":"79e186b80db1fa28432db3020007bcbdfb9c804c723a912af5b7e54b9dcd7613"} Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.031353 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.120172 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a191b34-d6cb-4afc-accf-4ec4ba9734af-secret-volume\") pod \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.120218 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snth7\" (UniqueName: \"kubernetes.io/projected/2a191b34-d6cb-4afc-accf-4ec4ba9734af-kube-api-access-snth7\") pod \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.120251 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2a191b34-d6cb-4afc-accf-4ec4ba9734af-config-volume\") pod \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\" (UID: \"2a191b34-d6cb-4afc-accf-4ec4ba9734af\") " Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.121375 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a191b34-d6cb-4afc-accf-4ec4ba9734af-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a191b34-d6cb-4afc-accf-4ec4ba9734af" (UID: "2a191b34-d6cb-4afc-accf-4ec4ba9734af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.125745 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a191b34-d6cb-4afc-accf-4ec4ba9734af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a191b34-d6cb-4afc-accf-4ec4ba9734af" (UID: "2a191b34-d6cb-4afc-accf-4ec4ba9734af"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.126279 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a191b34-d6cb-4afc-accf-4ec4ba9734af-kube-api-access-snth7" (OuterVolumeSpecName: "kube-api-access-snth7") pod "2a191b34-d6cb-4afc-accf-4ec4ba9734af" (UID: "2a191b34-d6cb-4afc-accf-4ec4ba9734af"). InnerVolumeSpecName "kube-api-access-snth7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.221616 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a191b34-d6cb-4afc-accf-4ec4ba9734af-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.221667 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snth7\" (UniqueName: \"kubernetes.io/projected/2a191b34-d6cb-4afc-accf-4ec4ba9734af-kube-api-access-snth7\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.221682 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a191b34-d6cb-4afc-accf-4ec4ba9734af-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.796318 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" event={"ID":"2a191b34-d6cb-4afc-accf-4ec4ba9734af","Type":"ContainerDied","Data":"79e186b80db1fa28432db3020007bcbdfb9c804c723a912af5b7e54b9dcd7613"} Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.796358 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79e186b80db1fa28432db3020007bcbdfb9c804c723a912af5b7e54b9dcd7613" Dec 16 08:00:03 crc kubenswrapper[4789]: I1216 08:00:03.796410 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl" Dec 16 08:00:04 crc kubenswrapper[4789]: I1216 08:00:04.129621 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4"] Dec 16 08:00:04 crc kubenswrapper[4789]: I1216 08:00:04.136887 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431155-g4cb4"] Dec 16 08:00:04 crc kubenswrapper[4789]: I1216 08:00:04.806447 4789 generic.go:334] "Generic (PLEG): container finished" podID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerID="251cc6ce88a673e760f2e5e3f671596c6decbd5db7010ea71a22e3e6e34a8c5b" exitCode=0 Dec 16 08:00:04 crc kubenswrapper[4789]: I1216 08:00:04.806499 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhhpn" event={"ID":"b747f06b-9d72-45bf-8d14-bbb4b5b68583","Type":"ContainerDied","Data":"251cc6ce88a673e760f2e5e3f671596c6decbd5db7010ea71a22e3e6e34a8c5b"} Dec 16 08:00:05 crc kubenswrapper[4789]: I1216 08:00:05.105446 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 08:00:05 crc kubenswrapper[4789]: E1216 08:00:05.105840 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:00:05 crc kubenswrapper[4789]: I1216 08:00:05.821478 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhhpn" 
event={"ID":"b747f06b-9d72-45bf-8d14-bbb4b5b68583","Type":"ContainerStarted","Data":"8b037157dcd5d469d6e2774234f216e1d9413b95f7d8e363dddbbf6881efd27e"} Dec 16 08:00:05 crc kubenswrapper[4789]: I1216 08:00:05.842722 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bhhpn" podStartSLOduration=3.202112153 podStartE2EDuration="6.842696168s" podCreationTimestamp="2025-12-16 07:59:59 +0000 UTC" firstStartedPulling="2025-12-16 08:00:01.778569797 +0000 UTC m=+4140.040457456" lastFinishedPulling="2025-12-16 08:00:05.419153832 +0000 UTC m=+4143.681041471" observedRunningTime="2025-12-16 08:00:05.836923437 +0000 UTC m=+4144.098811066" watchObservedRunningTime="2025-12-16 08:00:05.842696168 +0000 UTC m=+4144.104583797" Dec 16 08:00:06 crc kubenswrapper[4789]: I1216 08:00:06.113246 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7e593c-0688-4ff7-b959-37f36a74aa2b" path="/var/lib/kubelet/pods/1e7e593c-0688-4ff7-b959-37f36a74aa2b/volumes" Dec 16 08:00:10 crc kubenswrapper[4789]: I1216 08:00:10.167523 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:10 crc kubenswrapper[4789]: I1216 08:00:10.169115 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:10 crc kubenswrapper[4789]: I1216 08:00:10.220573 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:10 crc kubenswrapper[4789]: I1216 08:00:10.917139 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:10 crc kubenswrapper[4789]: I1216 08:00:10.975218 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhhpn"] Dec 16 08:00:12 crc 
kubenswrapper[4789]: I1216 08:00:12.868264 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bhhpn" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="registry-server" containerID="cri-o://8b037157dcd5d469d6e2774234f216e1d9413b95f7d8e363dddbbf6881efd27e" gracePeriod=2 Dec 16 08:00:13 crc kubenswrapper[4789]: I1216 08:00:13.878242 4789 generic.go:334] "Generic (PLEG): container finished" podID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerID="8b037157dcd5d469d6e2774234f216e1d9413b95f7d8e363dddbbf6881efd27e" exitCode=0 Dec 16 08:00:13 crc kubenswrapper[4789]: I1216 08:00:13.878308 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhhpn" event={"ID":"b747f06b-9d72-45bf-8d14-bbb4b5b68583","Type":"ContainerDied","Data":"8b037157dcd5d469d6e2774234f216e1d9413b95f7d8e363dddbbf6881efd27e"} Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.050201 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.198798 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-catalog-content\") pod \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.198990 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-utilities\") pod \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.199129 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpz8g\" (UniqueName: \"kubernetes.io/projected/b747f06b-9d72-45bf-8d14-bbb4b5b68583-kube-api-access-kpz8g\") pod \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\" (UID: \"b747f06b-9d72-45bf-8d14-bbb4b5b68583\") " Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.200415 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-utilities" (OuterVolumeSpecName: "utilities") pod "b747f06b-9d72-45bf-8d14-bbb4b5b68583" (UID: "b747f06b-9d72-45bf-8d14-bbb4b5b68583"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.221095 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b747f06b-9d72-45bf-8d14-bbb4b5b68583" (UID: "b747f06b-9d72-45bf-8d14-bbb4b5b68583"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.222481 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b747f06b-9d72-45bf-8d14-bbb4b5b68583-kube-api-access-kpz8g" (OuterVolumeSpecName: "kube-api-access-kpz8g") pod "b747f06b-9d72-45bf-8d14-bbb4b5b68583" (UID: "b747f06b-9d72-45bf-8d14-bbb4b5b68583"). InnerVolumeSpecName "kube-api-access-kpz8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.301256 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.301298 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b747f06b-9d72-45bf-8d14-bbb4b5b68583-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.301309 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpz8g\" (UniqueName: \"kubernetes.io/projected/b747f06b-9d72-45bf-8d14-bbb4b5b68583-kube-api-access-kpz8g\") on node \"crc\" DevicePath \"\"" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.891281 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhhpn" event={"ID":"b747f06b-9d72-45bf-8d14-bbb4b5b68583","Type":"ContainerDied","Data":"36ddd3138f8b383b87cb25d706289e4aa55c9114134a3a570c02b95ff72d3cef"} Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.891352 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhhpn" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.891356 4789 scope.go:117] "RemoveContainer" containerID="8b037157dcd5d469d6e2774234f216e1d9413b95f7d8e363dddbbf6881efd27e" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.911727 4789 scope.go:117] "RemoveContainer" containerID="251cc6ce88a673e760f2e5e3f671596c6decbd5db7010ea71a22e3e6e34a8c5b" Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.924005 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhhpn"] Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.930899 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhhpn"] Dec 16 08:00:14 crc kubenswrapper[4789]: I1216 08:00:14.948736 4789 scope.go:117] "RemoveContainer" containerID="f910f6ae04a9cb2f642989f26b44687a7cbf55b85d0eaff7c31547daac5e782c" Dec 16 08:00:16 crc kubenswrapper[4789]: I1216 08:00:16.127949 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" path="/var/lib/kubelet/pods/b747f06b-9d72-45bf-8d14-bbb4b5b68583/volumes" Dec 16 08:00:19 crc kubenswrapper[4789]: I1216 08:00:19.105320 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 08:00:19 crc kubenswrapper[4789]: E1216 08:00:19.106274 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:00:28 crc kubenswrapper[4789]: I1216 08:00:28.821828 4789 scope.go:117] "RemoveContainer" 
containerID="99834c5c917095ba527996933ce5acb86ed694cf016c7bc85b9712ee416f3bd3" Dec 16 08:00:33 crc kubenswrapper[4789]: I1216 08:00:33.104904 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 08:00:34 crc kubenswrapper[4789]: I1216 08:00:34.040521 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"c2e924339a8b79f5acea702841f03d79960142f0749c3e2bfe47fc0008691ee8"} Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.624058 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-f9mt2"] Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.629749 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-f9mt2"] Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.798337 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-s2pbz"] Dec 16 08:01:36 crc kubenswrapper[4789]: E1216 08:01:36.798798 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="extract-utilities" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.798818 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="extract-utilities" Dec 16 08:01:36 crc kubenswrapper[4789]: E1216 08:01:36.798834 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="registry-server" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.798842 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="registry-server" Dec 16 08:01:36 crc kubenswrapper[4789]: E1216 08:01:36.798862 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a191b34-d6cb-4afc-accf-4ec4ba9734af" containerName="collect-profiles" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.798869 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a191b34-d6cb-4afc-accf-4ec4ba9734af" containerName="collect-profiles" Dec 16 08:01:36 crc kubenswrapper[4789]: E1216 08:01:36.798878 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="extract-content" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.798886 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="extract-content" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.799070 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b747f06b-9d72-45bf-8d14-bbb4b5b68583" containerName="registry-server" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.799090 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a191b34-d6cb-4afc-accf-4ec4ba9734af" containerName="collect-profiles" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.799648 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.802879 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.802966 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.803000 4789 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wmbjf" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.803052 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.812093 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-s2pbz"] Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.893671 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-node-mnt\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.894024 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppwtc\" (UniqueName: \"kubernetes.io/projected/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-kube-api-access-ppwtc\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.894546 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-crc-storage\") pod \"crc-storage-crc-s2pbz\" (UID: 
\"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.996491 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-crc-storage\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.996652 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-node-mnt\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.996776 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppwtc\" (UniqueName: \"kubernetes.io/projected/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-kube-api-access-ppwtc\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.996947 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-node-mnt\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:36 crc kubenswrapper[4789]: I1216 08:01:36.997195 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-crc-storage\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:37 crc kubenswrapper[4789]: I1216 08:01:37.022094 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppwtc\" (UniqueName: \"kubernetes.io/projected/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-kube-api-access-ppwtc\") pod \"crc-storage-crc-s2pbz\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:37 crc kubenswrapper[4789]: I1216 08:01:37.131069 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:37 crc kubenswrapper[4789]: I1216 08:01:37.569297 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-s2pbz"] Dec 16 08:01:37 crc kubenswrapper[4789]: I1216 08:01:37.722104 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s2pbz" event={"ID":"64f5ade1-b87e-4987-a47c-8d7506bf1e5f","Type":"ContainerStarted","Data":"565e7925ba28377b53c3b3ebe4e5987a3347dfa27b09773608928989868da098"} Dec 16 08:01:38 crc kubenswrapper[4789]: I1216 08:01:38.127186 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889" path="/var/lib/kubelet/pods/6d9bb574-20b4-4d32-9ef5-b9b9c1fdb889/volumes" Dec 16 08:01:38 crc kubenswrapper[4789]: I1216 08:01:38.733590 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s2pbz" event={"ID":"64f5ade1-b87e-4987-a47c-8d7506bf1e5f","Type":"ContainerStarted","Data":"de1777f35536cdd7e78f94fa01ece14c847a47ed7a4f967c67f80f8b37023539"} Dec 16 08:01:38 crc kubenswrapper[4789]: I1216 08:01:38.754987 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-s2pbz" podStartSLOduration=1.931364624 podStartE2EDuration="2.754973199s" podCreationTimestamp="2025-12-16 08:01:36 +0000 UTC" firstStartedPulling="2025-12-16 08:01:37.588186078 +0000 UTC m=+4235.850073737" lastFinishedPulling="2025-12-16 08:01:38.411794683 +0000 UTC m=+4236.673682312" 
observedRunningTime="2025-12-16 08:01:38.750641703 +0000 UTC m=+4237.012529342" watchObservedRunningTime="2025-12-16 08:01:38.754973199 +0000 UTC m=+4237.016860828" Dec 16 08:01:39 crc kubenswrapper[4789]: I1216 08:01:39.744313 4789 generic.go:334] "Generic (PLEG): container finished" podID="64f5ade1-b87e-4987-a47c-8d7506bf1e5f" containerID="de1777f35536cdd7e78f94fa01ece14c847a47ed7a4f967c67f80f8b37023539" exitCode=0 Dec 16 08:01:39 crc kubenswrapper[4789]: I1216 08:01:39.744368 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s2pbz" event={"ID":"64f5ade1-b87e-4987-a47c-8d7506bf1e5f","Type":"ContainerDied","Data":"de1777f35536cdd7e78f94fa01ece14c847a47ed7a4f967c67f80f8b37023539"} Dec 16 08:01:40 crc kubenswrapper[4789]: I1216 08:01:40.982886 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.057367 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppwtc\" (UniqueName: \"kubernetes.io/projected/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-kube-api-access-ppwtc\") pod \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.057652 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-crc-storage\") pod \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.057681 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-node-mnt\") pod \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\" (UID: \"64f5ade1-b87e-4987-a47c-8d7506bf1e5f\") " Dec 16 08:01:41 crc 
kubenswrapper[4789]: I1216 08:01:41.057858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "64f5ade1-b87e-4987-a47c-8d7506bf1e5f" (UID: "64f5ade1-b87e-4987-a47c-8d7506bf1e5f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.058014 4789 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.062863 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-kube-api-access-ppwtc" (OuterVolumeSpecName: "kube-api-access-ppwtc") pod "64f5ade1-b87e-4987-a47c-8d7506bf1e5f" (UID: "64f5ade1-b87e-4987-a47c-8d7506bf1e5f"). InnerVolumeSpecName "kube-api-access-ppwtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.079392 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "64f5ade1-b87e-4987-a47c-8d7506bf1e5f" (UID: "64f5ade1-b87e-4987-a47c-8d7506bf1e5f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.159270 4789 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.159305 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppwtc\" (UniqueName: \"kubernetes.io/projected/64f5ade1-b87e-4987-a47c-8d7506bf1e5f-kube-api-access-ppwtc\") on node \"crc\" DevicePath \"\"" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.757695 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s2pbz" event={"ID":"64f5ade1-b87e-4987-a47c-8d7506bf1e5f","Type":"ContainerDied","Data":"565e7925ba28377b53c3b3ebe4e5987a3347dfa27b09773608928989868da098"} Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.757980 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="565e7925ba28377b53c3b3ebe4e5987a3347dfa27b09773608928989868da098" Dec 16 08:01:41 crc kubenswrapper[4789]: I1216 08:01:41.757751 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-s2pbz" Dec 16 08:01:42 crc kubenswrapper[4789]: I1216 08:01:42.967241 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-s2pbz"] Dec 16 08:01:42 crc kubenswrapper[4789]: I1216 08:01:42.973116 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-s2pbz"] Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.093955 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-c26jt"] Dec 16 08:01:43 crc kubenswrapper[4789]: E1216 08:01:43.094298 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f5ade1-b87e-4987-a47c-8d7506bf1e5f" containerName="storage" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.094311 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f5ade1-b87e-4987-a47c-8d7506bf1e5f" containerName="storage" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.094450 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f5ade1-b87e-4987-a47c-8d7506bf1e5f" containerName="storage" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.094881 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.097239 4789 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wmbjf" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.097349 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.097388 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.097538 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.103685 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-c26jt"] Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.188629 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqb2r\" (UniqueName: \"kubernetes.io/projected/091a4253-d1a0-4ef4-83ce-8d32895c5dac-kube-api-access-mqb2r\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.189150 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/091a4253-d1a0-4ef4-83ce-8d32895c5dac-node-mnt\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.189255 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/091a4253-d1a0-4ef4-83ce-8d32895c5dac-crc-storage\") pod \"crc-storage-crc-c26jt\" (UID: 
\"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.290334 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/091a4253-d1a0-4ef4-83ce-8d32895c5dac-crc-storage\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.290548 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqb2r\" (UniqueName: \"kubernetes.io/projected/091a4253-d1a0-4ef4-83ce-8d32895c5dac-kube-api-access-mqb2r\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.290634 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/091a4253-d1a0-4ef4-83ce-8d32895c5dac-node-mnt\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.290901 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/091a4253-d1a0-4ef4-83ce-8d32895c5dac-node-mnt\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.291223 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/091a4253-d1a0-4ef4-83ce-8d32895c5dac-crc-storage\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.308350 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqb2r\" (UniqueName: \"kubernetes.io/projected/091a4253-d1a0-4ef4-83ce-8d32895c5dac-kube-api-access-mqb2r\") pod \"crc-storage-crc-c26jt\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.430454 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:43 crc kubenswrapper[4789]: I1216 08:01:43.845305 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-c26jt"] Dec 16 08:01:44 crc kubenswrapper[4789]: I1216 08:01:44.114129 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f5ade1-b87e-4987-a47c-8d7506bf1e5f" path="/var/lib/kubelet/pods/64f5ade1-b87e-4987-a47c-8d7506bf1e5f/volumes" Dec 16 08:01:44 crc kubenswrapper[4789]: I1216 08:01:44.781376 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-c26jt" event={"ID":"091a4253-d1a0-4ef4-83ce-8d32895c5dac","Type":"ContainerStarted","Data":"d861e3096cc907a1aaed01d8c4bb336d1b3f6ac88078fbaf75c5b4c3cba1e57b"} Dec 16 08:01:45 crc kubenswrapper[4789]: I1216 08:01:45.789328 4789 generic.go:334] "Generic (PLEG): container finished" podID="091a4253-d1a0-4ef4-83ce-8d32895c5dac" containerID="f64dca07f11503f8b0c07af0dcd2732062491bd84e56a992cb49c0c2068f11ec" exitCode=0 Dec 16 08:01:45 crc kubenswrapper[4789]: I1216 08:01:45.789369 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-c26jt" event={"ID":"091a4253-d1a0-4ef4-83ce-8d32895c5dac","Type":"ContainerDied","Data":"f64dca07f11503f8b0c07af0dcd2732062491bd84e56a992cb49c0c2068f11ec"} Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.052374 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.137190 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqb2r\" (UniqueName: \"kubernetes.io/projected/091a4253-d1a0-4ef4-83ce-8d32895c5dac-kube-api-access-mqb2r\") pod \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.137354 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/091a4253-d1a0-4ef4-83ce-8d32895c5dac-node-mnt\") pod \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.137384 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/091a4253-d1a0-4ef4-83ce-8d32895c5dac-crc-storage\") pod \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\" (UID: \"091a4253-d1a0-4ef4-83ce-8d32895c5dac\") " Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.137512 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/091a4253-d1a0-4ef4-83ce-8d32895c5dac-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "091a4253-d1a0-4ef4-83ce-8d32895c5dac" (UID: "091a4253-d1a0-4ef4-83ce-8d32895c5dac"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.137847 4789 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/091a4253-d1a0-4ef4-83ce-8d32895c5dac-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.142663 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091a4253-d1a0-4ef4-83ce-8d32895c5dac-kube-api-access-mqb2r" (OuterVolumeSpecName: "kube-api-access-mqb2r") pod "091a4253-d1a0-4ef4-83ce-8d32895c5dac" (UID: "091a4253-d1a0-4ef4-83ce-8d32895c5dac"). InnerVolumeSpecName "kube-api-access-mqb2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.159048 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091a4253-d1a0-4ef4-83ce-8d32895c5dac-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "091a4253-d1a0-4ef4-83ce-8d32895c5dac" (UID: "091a4253-d1a0-4ef4-83ce-8d32895c5dac"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.239001 4789 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/091a4253-d1a0-4ef4-83ce-8d32895c5dac-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.239037 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqb2r\" (UniqueName: \"kubernetes.io/projected/091a4253-d1a0-4ef4-83ce-8d32895c5dac-kube-api-access-mqb2r\") on node \"crc\" DevicePath \"\"" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.807571 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-c26jt" event={"ID":"091a4253-d1a0-4ef4-83ce-8d32895c5dac","Type":"ContainerDied","Data":"d861e3096cc907a1aaed01d8c4bb336d1b3f6ac88078fbaf75c5b4c3cba1e57b"} Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.808575 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d861e3096cc907a1aaed01d8c4bb336d1b3f6ac88078fbaf75c5b4c3cba1e57b" Dec 16 08:01:47 crc kubenswrapper[4789]: I1216 08:01:47.808200 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-c26jt" Dec 16 08:02:28 crc kubenswrapper[4789]: I1216 08:02:28.919607 4789 scope.go:117] "RemoveContainer" containerID="4671e6c07c62e8f147205567e01189d3299a3e899d46fe76d7ac1f98bcf71f74" Dec 16 08:02:51 crc kubenswrapper[4789]: I1216 08:02:51.928096 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:02:51 crc kubenswrapper[4789]: I1216 08:02:51.928691 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.314279 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5z6mr"] Dec 16 08:03:05 crc kubenswrapper[4789]: E1216 08:03:05.315431 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091a4253-d1a0-4ef4-83ce-8d32895c5dac" containerName="storage" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.315450 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="091a4253-d1a0-4ef4-83ce-8d32895c5dac" containerName="storage" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.315657 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="091a4253-d1a0-4ef4-83ce-8d32895c5dac" containerName="storage" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.316879 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.331998 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5z6mr"] Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.495046 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-utilities\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.495111 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxkzc\" (UniqueName: \"kubernetes.io/projected/3cbc1256-03cd-427e-b842-70574a829ac6-kube-api-access-wxkzc\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.495196 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-catalog-content\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.596774 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-catalog-content\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.596878 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-utilities\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.596929 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxkzc\" (UniqueName: \"kubernetes.io/projected/3cbc1256-03cd-427e-b842-70574a829ac6-kube-api-access-wxkzc\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.597722 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-catalog-content\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.598039 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-utilities\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.630562 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxkzc\" (UniqueName: \"kubernetes.io/projected/3cbc1256-03cd-427e-b842-70574a829ac6-kube-api-access-wxkzc\") pod \"certified-operators-5z6mr\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:05 crc kubenswrapper[4789]: I1216 08:03:05.643352 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:06 crc kubenswrapper[4789]: I1216 08:03:06.027274 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5z6mr"] Dec 16 08:03:06 crc kubenswrapper[4789]: I1216 08:03:06.456788 4789 generic.go:334] "Generic (PLEG): container finished" podID="3cbc1256-03cd-427e-b842-70574a829ac6" containerID="3dfc4097b19a0d36236b1673aa855ac1896641134cec68fc567ebf0264dee5f3" exitCode=0 Dec 16 08:03:06 crc kubenswrapper[4789]: I1216 08:03:06.456829 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5z6mr" event={"ID":"3cbc1256-03cd-427e-b842-70574a829ac6","Type":"ContainerDied","Data":"3dfc4097b19a0d36236b1673aa855ac1896641134cec68fc567ebf0264dee5f3"} Dec 16 08:03:06 crc kubenswrapper[4789]: I1216 08:03:06.456854 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5z6mr" event={"ID":"3cbc1256-03cd-427e-b842-70574a829ac6","Type":"ContainerStarted","Data":"b85a4a8160637b46d1862293e92900a0044af35467c5ba073b1e87bc283dc4ed"} Dec 16 08:03:08 crc kubenswrapper[4789]: I1216 08:03:08.472001 4789 generic.go:334] "Generic (PLEG): container finished" podID="3cbc1256-03cd-427e-b842-70574a829ac6" containerID="1e451543e39fae4aa7d8378397af96ba481385c100aa6ac0774a0c5490a0ab17" exitCode=0 Dec 16 08:03:08 crc kubenswrapper[4789]: I1216 08:03:08.472092 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5z6mr" event={"ID":"3cbc1256-03cd-427e-b842-70574a829ac6","Type":"ContainerDied","Data":"1e451543e39fae4aa7d8378397af96ba481385c100aa6ac0774a0c5490a0ab17"} Dec 16 08:03:09 crc kubenswrapper[4789]: I1216 08:03:09.483565 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5z6mr" 
event={"ID":"3cbc1256-03cd-427e-b842-70574a829ac6","Type":"ContainerStarted","Data":"526529eebdee032e212a0dc7c7fbf70e85f2669f45bd84d549fb8794abd4283f"} Dec 16 08:03:09 crc kubenswrapper[4789]: I1216 08:03:09.505690 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5z6mr" podStartSLOduration=2.028420207 podStartE2EDuration="4.50567118s" podCreationTimestamp="2025-12-16 08:03:05 +0000 UTC" firstStartedPulling="2025-12-16 08:03:06.458546842 +0000 UTC m=+4324.720434471" lastFinishedPulling="2025-12-16 08:03:08.935797815 +0000 UTC m=+4327.197685444" observedRunningTime="2025-12-16 08:03:09.502835601 +0000 UTC m=+4327.764723260" watchObservedRunningTime="2025-12-16 08:03:09.50567118 +0000 UTC m=+4327.767558819" Dec 16 08:03:15 crc kubenswrapper[4789]: I1216 08:03:15.644467 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:15 crc kubenswrapper[4789]: I1216 08:03:15.645605 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:15 crc kubenswrapper[4789]: I1216 08:03:15.705579 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:16 crc kubenswrapper[4789]: I1216 08:03:16.599508 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:16 crc kubenswrapper[4789]: I1216 08:03:16.647422 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5z6mr"] Dec 16 08:03:18 crc kubenswrapper[4789]: I1216 08:03:18.553577 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5z6mr" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" containerName="registry-server" 
containerID="cri-o://526529eebdee032e212a0dc7c7fbf70e85f2669f45bd84d549fb8794abd4283f" gracePeriod=2 Dec 16 08:03:19 crc kubenswrapper[4789]: I1216 08:03:19.566737 4789 generic.go:334] "Generic (PLEG): container finished" podID="3cbc1256-03cd-427e-b842-70574a829ac6" containerID="526529eebdee032e212a0dc7c7fbf70e85f2669f45bd84d549fb8794abd4283f" exitCode=0 Dec 16 08:03:19 crc kubenswrapper[4789]: I1216 08:03:19.566852 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5z6mr" event={"ID":"3cbc1256-03cd-427e-b842-70574a829ac6","Type":"ContainerDied","Data":"526529eebdee032e212a0dc7c7fbf70e85f2669f45bd84d549fb8794abd4283f"} Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.012194 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.138067 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-utilities\") pod \"3cbc1256-03cd-427e-b842-70574a829ac6\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.138252 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-catalog-content\") pod \"3cbc1256-03cd-427e-b842-70574a829ac6\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.138338 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkzc\" (UniqueName: \"kubernetes.io/projected/3cbc1256-03cd-427e-b842-70574a829ac6-kube-api-access-wxkzc\") pod \"3cbc1256-03cd-427e-b842-70574a829ac6\" (UID: \"3cbc1256-03cd-427e-b842-70574a829ac6\") " Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 
08:03:20.138983 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-utilities" (OuterVolumeSpecName: "utilities") pod "3cbc1256-03cd-427e-b842-70574a829ac6" (UID: "3cbc1256-03cd-427e-b842-70574a829ac6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.143853 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cbc1256-03cd-427e-b842-70574a829ac6-kube-api-access-wxkzc" (OuterVolumeSpecName: "kube-api-access-wxkzc") pod "3cbc1256-03cd-427e-b842-70574a829ac6" (UID: "3cbc1256-03cd-427e-b842-70574a829ac6"). InnerVolumeSpecName "kube-api-access-wxkzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.190327 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cbc1256-03cd-427e-b842-70574a829ac6" (UID: "3cbc1256-03cd-427e-b842-70574a829ac6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.239760 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.239787 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cbc1256-03cd-427e-b842-70574a829ac6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.239798 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkzc\" (UniqueName: \"kubernetes.io/projected/3cbc1256-03cd-427e-b842-70574a829ac6-kube-api-access-wxkzc\") on node \"crc\" DevicePath \"\"" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.575937 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5z6mr" event={"ID":"3cbc1256-03cd-427e-b842-70574a829ac6","Type":"ContainerDied","Data":"b85a4a8160637b46d1862293e92900a0044af35467c5ba073b1e87bc283dc4ed"} Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.576001 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5z6mr" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.576005 4789 scope.go:117] "RemoveContainer" containerID="526529eebdee032e212a0dc7c7fbf70e85f2669f45bd84d549fb8794abd4283f" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.602448 4789 scope.go:117] "RemoveContainer" containerID="1e451543e39fae4aa7d8378397af96ba481385c100aa6ac0774a0c5490a0ab17" Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.613343 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5z6mr"] Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.620215 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5z6mr"] Dec 16 08:03:20 crc kubenswrapper[4789]: I1216 08:03:20.638044 4789 scope.go:117] "RemoveContainer" containerID="3dfc4097b19a0d36236b1673aa855ac1896641134cec68fc567ebf0264dee5f3" Dec 16 08:03:21 crc kubenswrapper[4789]: I1216 08:03:21.928819 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:03:21 crc kubenswrapper[4789]: I1216 08:03:21.928965 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:03:22 crc kubenswrapper[4789]: I1216 08:03:22.118430 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" path="/var/lib/kubelet/pods/3cbc1256-03cd-427e-b842-70574a829ac6/volumes" Dec 16 08:03:51 crc kubenswrapper[4789]: 
I1216 08:03:51.928291 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:03:51 crc kubenswrapper[4789]: I1216 08:03:51.929987 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:03:51 crc kubenswrapper[4789]: I1216 08:03:51.930131 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:03:51 crc kubenswrapper[4789]: I1216 08:03:51.930834 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2e924339a8b79f5acea702841f03d79960142f0749c3e2bfe47fc0008691ee8"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:03:51 crc kubenswrapper[4789]: I1216 08:03:51.930973 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://c2e924339a8b79f5acea702841f03d79960142f0749c3e2bfe47fc0008691ee8" gracePeriod=600 Dec 16 08:03:52 crc kubenswrapper[4789]: I1216 08:03:52.848489 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="c2e924339a8b79f5acea702841f03d79960142f0749c3e2bfe47fc0008691ee8" exitCode=0 Dec 
16 08:03:52 crc kubenswrapper[4789]: I1216 08:03:52.848530 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"c2e924339a8b79f5acea702841f03d79960142f0749c3e2bfe47fc0008691ee8"} Dec 16 08:03:52 crc kubenswrapper[4789]: I1216 08:03:52.849220 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193"} Dec 16 08:03:52 crc kubenswrapper[4789]: I1216 08:03:52.849244 4789 scope.go:117] "RemoveContainer" containerID="f141601ea1c2c7150168f84b71d195fdb3922aee8fb4f293bc1b879f4deff43a" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.172504 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j9t7d"] Dec 16 08:04:28 crc kubenswrapper[4789]: E1216 08:04:28.173499 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" containerName="extract-utilities" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.173515 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" containerName="extract-utilities" Dec 16 08:04:28 crc kubenswrapper[4789]: E1216 08:04:28.173529 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" containerName="registry-server" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.173536 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" containerName="registry-server" Dec 16 08:04:28 crc kubenswrapper[4789]: E1216 08:04:28.173576 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" 
containerName="extract-content" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.173583 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" containerName="extract-content" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.173712 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cbc1256-03cd-427e-b842-70574a829ac6" containerName="registry-server" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.174722 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.179288 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j9t7d"] Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.357419 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-catalog-content\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.357511 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67cqm\" (UniqueName: \"kubernetes.io/projected/bf5d299c-6580-4eae-b462-5f16a36b00bd-kube-api-access-67cqm\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.357594 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-utilities\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " 
pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.458664 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-catalog-content\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.458760 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67cqm\" (UniqueName: \"kubernetes.io/projected/bf5d299c-6580-4eae-b462-5f16a36b00bd-kube-api-access-67cqm\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.458848 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-utilities\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.459147 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-catalog-content\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.459397 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-utilities\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " 
pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.485709 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67cqm\" (UniqueName: \"kubernetes.io/projected/bf5d299c-6580-4eae-b462-5f16a36b00bd-kube-api-access-67cqm\") pod \"community-operators-j9t7d\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:28 crc kubenswrapper[4789]: I1216 08:04:28.497079 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:29 crc kubenswrapper[4789]: I1216 08:04:28.999699 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j9t7d"] Dec 16 08:04:29 crc kubenswrapper[4789]: I1216 08:04:29.120141 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9t7d" event={"ID":"bf5d299c-6580-4eae-b462-5f16a36b00bd","Type":"ContainerStarted","Data":"096da547c4288fd954e083c9d02322f9f53e0d17258139e1dbe83aab23d6de8d"} Dec 16 08:04:30 crc kubenswrapper[4789]: I1216 08:04:30.130587 4789 generic.go:334] "Generic (PLEG): container finished" podID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerID="44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5" exitCode=0 Dec 16 08:04:30 crc kubenswrapper[4789]: I1216 08:04:30.130636 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9t7d" event={"ID":"bf5d299c-6580-4eae-b462-5f16a36b00bd","Type":"ContainerDied","Data":"44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5"} Dec 16 08:04:30 crc kubenswrapper[4789]: I1216 08:04:30.133629 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:04:32 crc kubenswrapper[4789]: I1216 08:04:32.146384 4789 generic.go:334] "Generic (PLEG): container 
finished" podID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerID="b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f" exitCode=0 Dec 16 08:04:32 crc kubenswrapper[4789]: I1216 08:04:32.146451 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9t7d" event={"ID":"bf5d299c-6580-4eae-b462-5f16a36b00bd","Type":"ContainerDied","Data":"b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f"} Dec 16 08:04:33 crc kubenswrapper[4789]: I1216 08:04:33.155988 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9t7d" event={"ID":"bf5d299c-6580-4eae-b462-5f16a36b00bd","Type":"ContainerStarted","Data":"dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213"} Dec 16 08:04:33 crc kubenswrapper[4789]: I1216 08:04:33.171643 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j9t7d" podStartSLOduration=2.411938189 podStartE2EDuration="5.171622891s" podCreationTimestamp="2025-12-16 08:04:28 +0000 UTC" firstStartedPulling="2025-12-16 08:04:30.13329179 +0000 UTC m=+4408.395179429" lastFinishedPulling="2025-12-16 08:04:32.892976502 +0000 UTC m=+4411.154864131" observedRunningTime="2025-12-16 08:04:33.170578436 +0000 UTC m=+4411.432466065" watchObservedRunningTime="2025-12-16 08:04:33.171622891 +0000 UTC m=+4411.433510520" Dec 16 08:04:38 crc kubenswrapper[4789]: I1216 08:04:38.497277 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:38 crc kubenswrapper[4789]: I1216 08:04:38.498111 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:38 crc kubenswrapper[4789]: I1216 08:04:38.539578 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j9t7d" Dec 
16 08:04:39 crc kubenswrapper[4789]: I1216 08:04:39.235689 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:39 crc kubenswrapper[4789]: I1216 08:04:39.275503 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j9t7d"] Dec 16 08:04:41 crc kubenswrapper[4789]: I1216 08:04:41.210797 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j9t7d" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="registry-server" containerID="cri-o://dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213" gracePeriod=2 Dec 16 08:04:41 crc kubenswrapper[4789]: I1216 08:04:41.821480 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:41 crc kubenswrapper[4789]: I1216 08:04:41.946120 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-utilities\") pod \"bf5d299c-6580-4eae-b462-5f16a36b00bd\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " Dec 16 08:04:41 crc kubenswrapper[4789]: I1216 08:04:41.946305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-catalog-content\") pod \"bf5d299c-6580-4eae-b462-5f16a36b00bd\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " Dec 16 08:04:41 crc kubenswrapper[4789]: I1216 08:04:41.946443 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67cqm\" (UniqueName: \"kubernetes.io/projected/bf5d299c-6580-4eae-b462-5f16a36b00bd-kube-api-access-67cqm\") pod \"bf5d299c-6580-4eae-b462-5f16a36b00bd\" (UID: \"bf5d299c-6580-4eae-b462-5f16a36b00bd\") " 
Dec 16 08:04:41 crc kubenswrapper[4789]: I1216 08:04:41.947417 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-utilities" (OuterVolumeSpecName: "utilities") pod "bf5d299c-6580-4eae-b462-5f16a36b00bd" (UID: "bf5d299c-6580-4eae-b462-5f16a36b00bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:04:41 crc kubenswrapper[4789]: I1216 08:04:41.952368 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5d299c-6580-4eae-b462-5f16a36b00bd-kube-api-access-67cqm" (OuterVolumeSpecName: "kube-api-access-67cqm") pod "bf5d299c-6580-4eae-b462-5f16a36b00bd" (UID: "bf5d299c-6580-4eae-b462-5f16a36b00bd"). InnerVolumeSpecName "kube-api-access-67cqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.048373 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67cqm\" (UniqueName: \"kubernetes.io/projected/bf5d299c-6580-4eae-b462-5f16a36b00bd-kube-api-access-67cqm\") on node \"crc\" DevicePath \"\"" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.048761 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.219642 4789 generic.go:334] "Generic (PLEG): container finished" podID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerID="dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213" exitCode=0 Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.219700 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9t7d" 
event={"ID":"bf5d299c-6580-4eae-b462-5f16a36b00bd","Type":"ContainerDied","Data":"dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213"} Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.219713 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j9t7d" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.219731 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j9t7d" event={"ID":"bf5d299c-6580-4eae-b462-5f16a36b00bd","Type":"ContainerDied","Data":"096da547c4288fd954e083c9d02322f9f53e0d17258139e1dbe83aab23d6de8d"} Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.219752 4789 scope.go:117] "RemoveContainer" containerID="dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.236699 4789 scope.go:117] "RemoveContainer" containerID="b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.249598 4789 scope.go:117] "RemoveContainer" containerID="44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.273903 4789 scope.go:117] "RemoveContainer" containerID="dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213" Dec 16 08:04:42 crc kubenswrapper[4789]: E1216 08:04:42.274716 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213\": container with ID starting with dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213 not found: ID does not exist" containerID="dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.274783 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213"} err="failed to get container status \"dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213\": rpc error: code = NotFound desc = could not find container \"dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213\": container with ID starting with dc66f9403c1a7ec1b9d85328f34f7b9c67d0614551f82de8841142a521c98213 not found: ID does not exist" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.274813 4789 scope.go:117] "RemoveContainer" containerID="b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f" Dec 16 08:04:42 crc kubenswrapper[4789]: E1216 08:04:42.275627 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f\": container with ID starting with b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f not found: ID does not exist" containerID="b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.275668 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f"} err="failed to get container status \"b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f\": rpc error: code = NotFound desc = could not find container \"b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f\": container with ID starting with b3599ca632ddd4ad51e65f22d58b67c2d71203a94ed74f161de6ed47d79ae31f not found: ID does not exist" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.275693 4789 scope.go:117] "RemoveContainer" containerID="44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5" Dec 16 08:04:42 crc kubenswrapper[4789]: E1216 08:04:42.275992 4789 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5\": container with ID starting with 44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5 not found: ID does not exist" containerID="44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.276017 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5"} err="failed to get container status \"44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5\": rpc error: code = NotFound desc = could not find container \"44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5\": container with ID starting with 44592788ca1d5e390ec028a19156132158ced55d3d20ae9d086fe5335bed3af5 not found: ID does not exist" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.522799 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf5d299c-6580-4eae-b462-5f16a36b00bd" (UID: "bf5d299c-6580-4eae-b462-5f16a36b00bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.556074 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf5d299c-6580-4eae-b462-5f16a36b00bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.846341 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j9t7d"] Dec 16 08:04:42 crc kubenswrapper[4789]: I1216 08:04:42.856037 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j9t7d"] Dec 16 08:04:44 crc kubenswrapper[4789]: I1216 08:04:44.113318 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" path="/var/lib/kubelet/pods/bf5d299c-6580-4eae-b462-5f16a36b00bd/volumes" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.164428 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-ln5rf"] Dec 16 08:04:53 crc kubenswrapper[4789]: E1216 08:04:53.165189 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="extract-content" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.165202 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="extract-content" Dec 16 08:04:53 crc kubenswrapper[4789]: E1216 08:04:53.165218 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="registry-server" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.165223 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="registry-server" Dec 16 08:04:53 crc kubenswrapper[4789]: E1216 08:04:53.165249 4789 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="extract-utilities" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.165272 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="extract-utilities" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.165486 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5d299c-6580-4eae-b462-5f16a36b00bd" containerName="registry-server" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.172236 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.175748 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.176177 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.176331 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.176568 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.197431 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jcqsk" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.204591 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-ln5rf"] Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.317770 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjj7g\" (UniqueName: \"kubernetes.io/projected/4920a698-9500-40bb-b5ba-0a9036ee5fcf-kube-api-access-xjj7g\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: 
\"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.317854 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-dns-svc\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.317901 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-config\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.407055 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57484c487-vtj5c"] Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.408097 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.419136 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjj7g\" (UniqueName: \"kubernetes.io/projected/4920a698-9500-40bb-b5ba-0a9036ee5fcf-kube-api-access-xjj7g\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.419217 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-dns-svc\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.419254 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-config\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.419146 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57484c487-vtj5c"] Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.420178 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-dns-svc\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.420227 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-config\") pod 
\"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.447757 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjj7g\" (UniqueName: \"kubernetes.io/projected/4920a698-9500-40bb-b5ba-0a9036ee5fcf-kube-api-access-xjj7g\") pod \"dnsmasq-dns-6fdf89db6c-ln5rf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.520333 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7gqz\" (UniqueName: \"kubernetes.io/projected/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-kube-api-access-l7gqz\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.520396 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-dns-svc\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.520577 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-config\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.532003 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.621801 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-config\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.622198 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gqz\" (UniqueName: \"kubernetes.io/projected/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-kube-api-access-l7gqz\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.622239 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-dns-svc\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.622698 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-config\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.622925 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-dns-svc\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 
08:04:53.670110 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7gqz\" (UniqueName: \"kubernetes.io/projected/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-kube-api-access-l7gqz\") pod \"dnsmasq-dns-57484c487-vtj5c\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:53 crc kubenswrapper[4789]: I1216 08:04:53.734576 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.067717 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-ln5rf"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.192686 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57484c487-vtj5c"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.260404 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.261672 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.263774 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.263811 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dl2b6" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.264084 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.264130 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.264145 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.287594 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.301078 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" event={"ID":"4920a698-9500-40bb-b5ba-0a9036ee5fcf","Type":"ContainerStarted","Data":"e51f67126dffc1f71af2a07c66a50f66c3d6217f981404b63aaf333c4f79a86c"} Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.302152 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-vtj5c" event={"ID":"4fa7df80-d58b-4152-a351-ab7e27f2e9d2","Type":"ContainerStarted","Data":"f64ae994115417ebd036ea42cb0586d67f02828c8043bb49d635b87744193f18"} Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.443946 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444013 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20764076-1e10-41bf-ad47-4879689fb282-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444038 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmqrv\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-kube-api-access-wmqrv\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444063 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444095 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444112 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444404 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444475 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.444509 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20764076-1e10-41bf-ad47-4879689fb282-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.545958 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.546052 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.546140 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.546199 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.546714 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.546971 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.547055 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 
08:04:54.547169 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.547323 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20764076-1e10-41bf-ad47-4879689fb282-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.547450 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.547577 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20764076-1e10-41bf-ad47-4879689fb282-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.547654 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmqrv\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-kube-api-access-wmqrv\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.547714 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.550591 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.550687 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b609b3066bfbcfea006523a4f9903d81b2cd3d231d4ffc49326ec9c9c517e442/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.554847 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20764076-1e10-41bf-ad47-4879689fb282-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.554975 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20764076-1e10-41bf-ad47-4879689fb282-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.556371 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.569321 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmqrv\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-kube-api-access-wmqrv\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.571759 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.572869 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.578710 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.579377 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8rgz7" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.579762 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.580470 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.582823 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.587613 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.589076 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.650030 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.717840 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.719442 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.721365 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.728163 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.728433 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-plsx5" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.729008 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.732308 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.735202 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750189 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750229 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750258 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjtz\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-kube-api-access-zmjtz\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750278 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750299 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868049ed-5783-4da6-91b3-39954ca45bab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750313 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750328 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750356 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868049ed-5783-4da6-91b3-39954ca45bab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.750371 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851704 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851765 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851801 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmjtz\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-kube-api-access-zmjtz\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851829 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851855 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868049ed-5783-4da6-91b3-39954ca45bab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851873 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851895 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851944 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-config-data-default\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.851979 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852003 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868049ed-5783-4da6-91b3-39954ca45bab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852054 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852363 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-kolla-config\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852501 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npb9\" (UniqueName: \"kubernetes.io/projected/b040e505-3d77-42ec-b501-1b6fd0799640-kube-api-access-9npb9\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852649 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b040e505-3d77-42ec-b501-1b6fd0799640-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852802 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b040e505-3d77-42ec-b501-1b6fd0799640-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852843 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852882 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b040e505-3d77-42ec-b501-1b6fd0799640-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852954 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.852998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.853401 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.854863 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.857527 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868049ed-5783-4da6-91b3-39954ca45bab-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.858679 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868049ed-5783-4da6-91b3-39954ca45bab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.858900 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.858952 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2f37d39606cb1deeffe1438d067eb417957ccefc73ea0d2a89b934fb2a08fd9/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.865787 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.871458 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmjtz\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-kube-api-access-zmjtz\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.895624 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.947529 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954470 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954505 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b040e505-3d77-42ec-b501-1b6fd0799640-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954545 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-config-data-default\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954584 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954626 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-kolla-config\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954643 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npb9\" (UniqueName: \"kubernetes.io/projected/b040e505-3d77-42ec-b501-1b6fd0799640-kube-api-access-9npb9\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954670 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b040e505-3d77-42ec-b501-1b6fd0799640-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.954699 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b040e505-3d77-42ec-b501-1b6fd0799640-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.955073 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b040e505-3d77-42ec-b501-1b6fd0799640-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.957607 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.957639 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3f0c05bc1ce1f3bf25d2675afe4312b0d1fcb8628b472d07463b7a8450397c37/globalmount\"" pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.958009 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.959117 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b040e505-3d77-42ec-b501-1b6fd0799640-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.959751 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-config-data-default\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: 
I1216 08:04:54.959988 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b040e505-3d77-42ec-b501-1b6fd0799640-kolla-config\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.962238 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b040e505-3d77-42ec-b501-1b6fd0799640-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.973664 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npb9\" (UniqueName: \"kubernetes.io/projected/b040e505-3d77-42ec-b501-1b6fd0799640-kube-api-access-9npb9\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:54 crc kubenswrapper[4789]: I1216 08:04:54.987121 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00a05d3a-c116-44e0-86f5-9e5c3d656175\") pod \"openstack-galera-0\" (UID: \"b040e505-3d77-42ec-b501-1b6fd0799640\") " pod="openstack/openstack-galera-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.070495 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.157583 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:04:55 crc kubenswrapper[4789]: W1216 08:04:55.204754 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20764076_1e10_41bf_ad47_4879689fb282.slice/crio-2c1358d4ae4d6fa23ba7d8fe0fe0c08d371041130932f45766c7a277e2118302 WatchSource:0}: Error finding container 2c1358d4ae4d6fa23ba7d8fe0fe0c08d371041130932f45766c7a277e2118302: Status 404 returned error can't find the container with id 2c1358d4ae4d6fa23ba7d8fe0fe0c08d371041130932f45766c7a277e2118302 Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.287749 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.292178 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.296420 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.307347 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-t4npp" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.314846 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.362670 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20764076-1e10-41bf-ad47-4879689fb282","Type":"ContainerStarted","Data":"2c1358d4ae4d6fa23ba7d8fe0fe0c08d371041130932f45766c7a277e2118302"} Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.468752 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e364136-8097-48a4-ad88-c3fc2967154d-config-data\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.468867 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcp8\" (UniqueName: \"kubernetes.io/projected/1e364136-8097-48a4-ad88-c3fc2967154d-kube-api-access-6wcp8\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.468894 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e364136-8097-48a4-ad88-c3fc2967154d-kolla-config\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc 
kubenswrapper[4789]: I1216 08:04:55.540558 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.570988 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e364136-8097-48a4-ad88-c3fc2967154d-kolla-config\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.571042 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e364136-8097-48a4-ad88-c3fc2967154d-config-data\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.571140 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcp8\" (UniqueName: \"kubernetes.io/projected/1e364136-8097-48a4-ad88-c3fc2967154d-kube-api-access-6wcp8\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.573012 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e364136-8097-48a4-ad88-c3fc2967154d-config-data\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.573434 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1e364136-8097-48a4-ad88-c3fc2967154d-kolla-config\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.585167 4789 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:04:55 crc kubenswrapper[4789]: W1216 08:04:55.598613 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868049ed_5783_4da6_91b3_39954ca45bab.slice/crio-431f67cf54f0b78cd4cefdccb3029ebfa4627313b521f0c292be04fe75953fc8 WatchSource:0}: Error finding container 431f67cf54f0b78cd4cefdccb3029ebfa4627313b521f0c292be04fe75953fc8: Status 404 returned error can't find the container with id 431f67cf54f0b78cd4cefdccb3029ebfa4627313b521f0c292be04fe75953fc8 Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.605181 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcp8\" (UniqueName: \"kubernetes.io/projected/1e364136-8097-48a4-ad88-c3fc2967154d-kube-api-access-6wcp8\") pod \"memcached-0\" (UID: \"1e364136-8097-48a4-ad88-c3fc2967154d\") " pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.639967 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 16 08:04:55 crc kubenswrapper[4789]: I1216 08:04:55.871587 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 08:04:55 crc kubenswrapper[4789]: W1216 08:04:55.877070 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e364136_8097_48a4_ad88_c3fc2967154d.slice/crio-b66f7aad8dacc4d01a45fe8556adc4517ae6728e9d658d91804f6cda527e63d2 WatchSource:0}: Error finding container b66f7aad8dacc4d01a45fe8556adc4517ae6728e9d658d91804f6cda527e63d2: Status 404 returned error can't find the container with id b66f7aad8dacc4d01a45fe8556adc4517ae6728e9d658d91804f6cda527e63d2 Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.373693 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e364136-8097-48a4-ad88-c3fc2967154d","Type":"ContainerStarted","Data":"b66f7aad8dacc4d01a45fe8556adc4517ae6728e9d658d91804f6cda527e63d2"} Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.375116 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b040e505-3d77-42ec-b501-1b6fd0799640","Type":"ContainerStarted","Data":"de127c6d17a02a211469e013a9b3fa7399d49f071e2f2477e6431d5d5010f39b"} Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.377032 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868049ed-5783-4da6-91b3-39954ca45bab","Type":"ContainerStarted","Data":"431f67cf54f0b78cd4cefdccb3029ebfa4627313b521f0c292be04fe75953fc8"} Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.481112 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.482974 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.490735 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.517358 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.517593 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.518027 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.518212 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9j6p9" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.591946 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.591998 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.592049 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwsm2\" (UniqueName: 
\"kubernetes.io/projected/46553071-2569-4448-bd5c-f5862a4e71f5-kube-api-access-dwsm2\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.592086 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46553071-2569-4448-bd5c-f5862a4e71f5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.592111 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46553071-2569-4448-bd5c-f5862a4e71f5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.592144 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46553071-2569-4448-bd5c-f5862a4e71f5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.592182 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.592216 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693186 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46553071-2569-4448-bd5c-f5862a4e71f5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693252 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693276 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693327 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693351 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693386 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwsm2\" (UniqueName: \"kubernetes.io/projected/46553071-2569-4448-bd5c-f5862a4e71f5-kube-api-access-dwsm2\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693419 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46553071-2569-4448-bd5c-f5862a4e71f5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.693448 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46553071-2569-4448-bd5c-f5862a4e71f5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.696403 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.696607 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/46553071-2569-4448-bd5c-f5862a4e71f5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.696712 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.698952 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46553071-2569-4448-bd5c-f5862a4e71f5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.699538 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.699578 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1a2a94028bd4e98636b141c98d082dc33124ee0a9bfd44bf5fa837152258460e/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.702718 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46553071-2569-4448-bd5c-f5862a4e71f5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.703223 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46553071-2569-4448-bd5c-f5862a4e71f5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.710652 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwsm2\" (UniqueName: \"kubernetes.io/projected/46553071-2569-4448-bd5c-f5862a4e71f5-kube-api-access-dwsm2\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.737591 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-de81b197-f355-4bcf-b202-38e80aa1b5be\") pod \"openstack-cell1-galera-0\" (UID: \"46553071-2569-4448-bd5c-f5862a4e71f5\") " pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:56 crc kubenswrapper[4789]: I1216 08:04:56.861888 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 08:04:57 crc kubenswrapper[4789]: I1216 08:04:57.372518 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 08:04:57 crc kubenswrapper[4789]: W1216 08:04:57.386400 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46553071_2569_4448_bd5c_f5862a4e71f5.slice/crio-0cc71d4c1e1bbf7e4d3b4f1c90233823bfebaf88c6aaba35ce14b978f7e87533 WatchSource:0}: Error finding container 0cc71d4c1e1bbf7e4d3b4f1c90233823bfebaf88c6aaba35ce14b978f7e87533: Status 404 returned error can't find the container with id 0cc71d4c1e1bbf7e4d3b4f1c90233823bfebaf88c6aaba35ce14b978f7e87533 Dec 16 08:04:58 crc kubenswrapper[4789]: I1216 08:04:58.393420 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46553071-2569-4448-bd5c-f5862a4e71f5","Type":"ContainerStarted","Data":"0cc71d4c1e1bbf7e4d3b4f1c90233823bfebaf88c6aaba35ce14b978f7e87533"} Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.849072 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.849763 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.849945 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dwsm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscal
ation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(46553071-2569-4448-bd5c-f5862a4e71f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.851295 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="46553071-2569-4448-bd5c-f5862a4e71f5" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.857096 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.857147 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.857297 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:c3a837a7c939c44c9106d2b2c7c72015,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 
's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmqrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorP
rofile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(20764076-1e10-41bf-ad47-4879689fb282): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.859011 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="20764076-1e10-41bf-ad47-4879689fb282" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.859063 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.859135 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.859265 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9npb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(b040e505-3d77-42ec-b501-1b6fd0799640): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:05:19 crc kubenswrapper[4789]: E1216 08:05:19.861260 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="b040e505-3d77-42ec-b501-1b6fd0799640" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.429795 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.429850 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.429984 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h564h676h699hcdh67bh66hfdh569h545h648h94h546h696h668h89h96h667h575h595h5d9h584h8dhbdh697h54bhb7h58fh5c9hd8h5cdh5c7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7gqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57484c487-vtj5c_openstack(4fa7df80-d58b-4152-a351-ab7e27f2e9d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.431185 4789 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57484c487-vtj5c" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.452569 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.452623 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.452727 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjj7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fdf89db6c-ln5rf_openstack(4920a698-9500-40bb-b5ba-0a9036ee5fcf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.454078 4789 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" Dec 16 08:05:20 crc kubenswrapper[4789]: I1216 08:05:20.641290 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1e364136-8097-48a4-ad88-c3fc2967154d","Type":"ContainerStarted","Data":"1d574c574eda4f49d18d1661f2f3071c813df171ef4ca233a7b35c7bc57d42d3"} Dec 16 08:05:20 crc kubenswrapper[4789]: I1216 08:05:20.641737 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.643552 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.643638 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/dnsmasq-dns-57484c487-vtj5c" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" Dec 16 08:05:20 crc kubenswrapper[4789]: E1216 08:05:20.643690 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="46553071-2569-4448-bd5c-f5862a4e71f5" Dec 16 08:05:20 crc 
kubenswrapper[4789]: E1216 08:05:20.644263 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/openstack-galera-0" podUID="b040e505-3d77-42ec-b501-1b6fd0799640" Dec 16 08:05:20 crc kubenswrapper[4789]: I1216 08:05:20.731240 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.235008535 podStartE2EDuration="25.731219151s" podCreationTimestamp="2025-12-16 08:04:55 +0000 UTC" firstStartedPulling="2025-12-16 08:04:55.883810314 +0000 UTC m=+4434.145697943" lastFinishedPulling="2025-12-16 08:05:20.38002092 +0000 UTC m=+4458.641908559" observedRunningTime="2025-12-16 08:05:20.724526018 +0000 UTC m=+4458.986413657" watchObservedRunningTime="2025-12-16 08:05:20.731219151 +0000 UTC m=+4458.993106790" Dec 16 08:05:22 crc kubenswrapper[4789]: I1216 08:05:22.656226 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868049ed-5783-4da6-91b3-39954ca45bab","Type":"ContainerStarted","Data":"dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695"} Dec 16 08:05:22 crc kubenswrapper[4789]: I1216 08:05:22.660853 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20764076-1e10-41bf-ad47-4879689fb282","Type":"ContainerStarted","Data":"42b7c0f1608e472d03092ed54d07044d721c7ca1f35b1972eaa82e0aed56603b"} Dec 16 08:05:25 crc kubenswrapper[4789]: I1216 08:05:25.646696 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 16 08:05:31 crc kubenswrapper[4789]: I1216 08:05:31.737630 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"b040e505-3d77-42ec-b501-1b6fd0799640","Type":"ContainerStarted","Data":"8f197757812965eec1573a6676622623c49d7e58255fd3f21b2cc3088a4abe43"} Dec 16 08:05:32 crc kubenswrapper[4789]: I1216 08:05:32.751969 4789 generic.go:334] "Generic (PLEG): container finished" podID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" containerID="8a72ade8d9cbabe879b43f8cb1a947701ae7a08e8e9864c3914dbb593458c690" exitCode=0 Dec 16 08:05:32 crc kubenswrapper[4789]: I1216 08:05:32.752214 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" event={"ID":"4920a698-9500-40bb-b5ba-0a9036ee5fcf","Type":"ContainerDied","Data":"8a72ade8d9cbabe879b43f8cb1a947701ae7a08e8e9864c3914dbb593458c690"} Dec 16 08:05:33 crc kubenswrapper[4789]: I1216 08:05:33.773768 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" event={"ID":"4920a698-9500-40bb-b5ba-0a9036ee5fcf","Type":"ContainerStarted","Data":"5deedf067f614e2bcf59b8efed5aa577e611c2d81e828327367f56dd57b913ca"} Dec 16 08:05:33 crc kubenswrapper[4789]: I1216 08:05:33.775298 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:05:33 crc kubenswrapper[4789]: I1216 08:05:33.798603 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" podStartSLOduration=2.567617218 podStartE2EDuration="40.798580738s" podCreationTimestamp="2025-12-16 08:04:53 +0000 UTC" firstStartedPulling="2025-12-16 08:04:54.083733079 +0000 UTC m=+4432.345620708" lastFinishedPulling="2025-12-16 08:05:32.314696599 +0000 UTC m=+4470.576584228" observedRunningTime="2025-12-16 08:05:33.79418973 +0000 UTC m=+4472.056077379" watchObservedRunningTime="2025-12-16 08:05:33.798580738 +0000 UTC m=+4472.060468377" Dec 16 08:05:34 crc kubenswrapper[4789]: I1216 08:05:34.782206 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"46553071-2569-4448-bd5c-f5862a4e71f5","Type":"ContainerStarted","Data":"c58124e3c9229b7b520d223329d12a52947da4284839b0331ed7bf72ab8ba286"} Dec 16 08:05:35 crc kubenswrapper[4789]: I1216 08:05:35.791154 4789 generic.go:334] "Generic (PLEG): container finished" podID="b040e505-3d77-42ec-b501-1b6fd0799640" containerID="8f197757812965eec1573a6676622623c49d7e58255fd3f21b2cc3088a4abe43" exitCode=0 Dec 16 08:05:35 crc kubenswrapper[4789]: I1216 08:05:35.791218 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b040e505-3d77-42ec-b501-1b6fd0799640","Type":"ContainerDied","Data":"8f197757812965eec1573a6676622623c49d7e58255fd3f21b2cc3088a4abe43"} Dec 16 08:05:36 crc kubenswrapper[4789]: I1216 08:05:36.810436 4789 generic.go:334] "Generic (PLEG): container finished" podID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerID="78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032" exitCode=0 Dec 16 08:05:36 crc kubenswrapper[4789]: I1216 08:05:36.811432 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-vtj5c" event={"ID":"4fa7df80-d58b-4152-a351-ab7e27f2e9d2","Type":"ContainerDied","Data":"78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032"} Dec 16 08:05:36 crc kubenswrapper[4789]: I1216 08:05:36.822242 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b040e505-3d77-42ec-b501-1b6fd0799640","Type":"ContainerStarted","Data":"a81a33b9f901af36bc411a8708877ef2a79e4fd0b735437893175c31d2a8d77c"} Dec 16 08:05:36 crc kubenswrapper[4789]: I1216 08:05:36.881410 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.13461578 podStartE2EDuration="43.881386957s" podCreationTimestamp="2025-12-16 08:04:53 +0000 UTC" firstStartedPulling="2025-12-16 08:04:55.565262341 +0000 UTC m=+4433.827149970" lastFinishedPulling="2025-12-16 
08:05:31.312033478 +0000 UTC m=+4469.573921147" observedRunningTime="2025-12-16 08:05:36.878184629 +0000 UTC m=+4475.140072258" watchObservedRunningTime="2025-12-16 08:05:36.881386957 +0000 UTC m=+4475.143274586" Dec 16 08:05:37 crc kubenswrapper[4789]: I1216 08:05:37.832604 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-vtj5c" event={"ID":"4fa7df80-d58b-4152-a351-ab7e27f2e9d2","Type":"ContainerStarted","Data":"c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f"} Dec 16 08:05:37 crc kubenswrapper[4789]: I1216 08:05:37.833363 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:05:37 crc kubenswrapper[4789]: I1216 08:05:37.856777 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57484c487-vtj5c" podStartSLOduration=-9223371991.998028 podStartE2EDuration="44.856747601s" podCreationTimestamp="2025-12-16 08:04:53 +0000 UTC" firstStartedPulling="2025-12-16 08:04:54.210010414 +0000 UTC m=+4432.471898043" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:05:37.851520583 +0000 UTC m=+4476.113408212" watchObservedRunningTime="2025-12-16 08:05:37.856747601 +0000 UTC m=+4476.118635240" Dec 16 08:05:38 crc kubenswrapper[4789]: I1216 08:05:38.533080 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:05:38 crc kubenswrapper[4789]: I1216 08:05:38.840489 4789 generic.go:334] "Generic (PLEG): container finished" podID="46553071-2569-4448-bd5c-f5862a4e71f5" containerID="c58124e3c9229b7b520d223329d12a52947da4284839b0331ed7bf72ab8ba286" exitCode=0 Dec 16 08:05:38 crc kubenswrapper[4789]: I1216 08:05:38.840626 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"46553071-2569-4448-bd5c-f5862a4e71f5","Type":"ContainerDied","Data":"c58124e3c9229b7b520d223329d12a52947da4284839b0331ed7bf72ab8ba286"} Dec 16 08:05:39 crc kubenswrapper[4789]: I1216 08:05:39.849641 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46553071-2569-4448-bd5c-f5862a4e71f5","Type":"ContainerStarted","Data":"9ad1d7385dc9e7949c0bc5809f703b9b8635952e6fc09b8699aea16a4bb1baec"} Dec 16 08:05:39 crc kubenswrapper[4789]: I1216 08:05:39.873377 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371991.981415 podStartE2EDuration="44.873360707s" podCreationTimestamp="2025-12-16 08:04:55 +0000 UTC" firstStartedPulling="2025-12-16 08:04:57.389384924 +0000 UTC m=+4435.651272543" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:05:39.872602709 +0000 UTC m=+4478.134490348" watchObservedRunningTime="2025-12-16 08:05:39.873360707 +0000 UTC m=+4478.135248356" Dec 16 08:05:43 crc kubenswrapper[4789]: I1216 08:05:43.737131 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:05:43 crc kubenswrapper[4789]: I1216 08:05:43.782893 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-ln5rf"] Dec 16 08:05:43 crc kubenswrapper[4789]: I1216 08:05:43.783134 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" containerName="dnsmasq-dns" containerID="cri-o://5deedf067f614e2bcf59b8efed5aa577e611c2d81e828327367f56dd57b913ca" gracePeriod=10 Dec 16 08:05:44 crc kubenswrapper[4789]: I1216 08:05:44.895930 4789 generic.go:334] "Generic (PLEG): container finished" podID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" 
containerID="5deedf067f614e2bcf59b8efed5aa577e611c2d81e828327367f56dd57b913ca" exitCode=0 Dec 16 08:05:44 crc kubenswrapper[4789]: I1216 08:05:44.896475 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" event={"ID":"4920a698-9500-40bb-b5ba-0a9036ee5fcf","Type":"ContainerDied","Data":"5deedf067f614e2bcf59b8efed5aa577e611c2d81e828327367f56dd57b913ca"} Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.071685 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.071769 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.524205 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.692513 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjj7g\" (UniqueName: \"kubernetes.io/projected/4920a698-9500-40bb-b5ba-0a9036ee5fcf-kube-api-access-xjj7g\") pod \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.692565 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-dns-svc\") pod \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.692659 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-config\") pod \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\" (UID: \"4920a698-9500-40bb-b5ba-0a9036ee5fcf\") " Dec 16 08:05:45 crc 
kubenswrapper[4789]: I1216 08:05:45.701858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4920a698-9500-40bb-b5ba-0a9036ee5fcf-kube-api-access-xjj7g" (OuterVolumeSpecName: "kube-api-access-xjj7g") pod "4920a698-9500-40bb-b5ba-0a9036ee5fcf" (UID: "4920a698-9500-40bb-b5ba-0a9036ee5fcf"). InnerVolumeSpecName "kube-api-access-xjj7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.737990 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-config" (OuterVolumeSpecName: "config") pod "4920a698-9500-40bb-b5ba-0a9036ee5fcf" (UID: "4920a698-9500-40bb-b5ba-0a9036ee5fcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.746317 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.747268 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4920a698-9500-40bb-b5ba-0a9036ee5fcf" (UID: "4920a698-9500-40bb-b5ba-0a9036ee5fcf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.794899 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.794959 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjj7g\" (UniqueName: \"kubernetes.io/projected/4920a698-9500-40bb-b5ba-0a9036ee5fcf-kube-api-access-xjj7g\") on node \"crc\" DevicePath \"\"" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.794970 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4920a698-9500-40bb-b5ba-0a9036ee5fcf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.907540 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" event={"ID":"4920a698-9500-40bb-b5ba-0a9036ee5fcf","Type":"ContainerDied","Data":"e51f67126dffc1f71af2a07c66a50f66c3d6217f981404b63aaf333c4f79a86c"} Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.907627 4789 scope.go:117] "RemoveContainer" containerID="5deedf067f614e2bcf59b8efed5aa577e611c2d81e828327367f56dd57b913ca" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.907867 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fdf89db6c-ln5rf" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.939761 4789 scope.go:117] "RemoveContainer" containerID="8a72ade8d9cbabe879b43f8cb1a947701ae7a08e8e9864c3914dbb593458c690" Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.962531 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-ln5rf"] Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.968226 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fdf89db6c-ln5rf"] Dec 16 08:05:45 crc kubenswrapper[4789]: I1216 08:05:45.998922 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 16 08:05:46 crc kubenswrapper[4789]: I1216 08:05:46.115215 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" path="/var/lib/kubelet/pods/4920a698-9500-40bb-b5ba-0a9036ee5fcf/volumes" Dec 16 08:05:46 crc kubenswrapper[4789]: I1216 08:05:46.862679 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 16 08:05:46 crc kubenswrapper[4789]: I1216 08:05:46.863073 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 16 08:05:49 crc kubenswrapper[4789]: I1216 08:05:49.048880 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 16 08:05:49 crc kubenswrapper[4789]: I1216 08:05:49.115788 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 16 08:05:55 crc kubenswrapper[4789]: I1216 08:05:55.974033 4789 generic.go:334] "Generic (PLEG): container finished" podID="868049ed-5783-4da6-91b3-39954ca45bab" containerID="dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695" exitCode=0 Dec 16 08:05:55 crc 
kubenswrapper[4789]: I1216 08:05:55.974119 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868049ed-5783-4da6-91b3-39954ca45bab","Type":"ContainerDied","Data":"dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695"} Dec 16 08:05:55 crc kubenswrapper[4789]: I1216 08:05:55.975945 4789 generic.go:334] "Generic (PLEG): container finished" podID="20764076-1e10-41bf-ad47-4879689fb282" containerID="42b7c0f1608e472d03092ed54d07044d721c7ca1f35b1972eaa82e0aed56603b" exitCode=0 Dec 16 08:05:55 crc kubenswrapper[4789]: I1216 08:05:55.975966 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20764076-1e10-41bf-ad47-4879689fb282","Type":"ContainerDied","Data":"42b7c0f1608e472d03092ed54d07044d721c7ca1f35b1972eaa82e0aed56603b"} Dec 16 08:05:56 crc kubenswrapper[4789]: I1216 08:05:56.984303 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20764076-1e10-41bf-ad47-4879689fb282","Type":"ContainerStarted","Data":"891f8b7728a503834b7471d89550d4e38caa4f3dda96978d0b0dcee7071c10a3"} Dec 16 08:05:56 crc kubenswrapper[4789]: I1216 08:05:56.984838 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 08:05:56 crc kubenswrapper[4789]: I1216 08:05:56.985879 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868049ed-5783-4da6-91b3-39954ca45bab","Type":"ContainerStarted","Data":"73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef"} Dec 16 08:05:56 crc kubenswrapper[4789]: I1216 08:05:56.986132 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:05:57 crc kubenswrapper[4789]: I1216 08:05:57.003122 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=-9223371972.851675 podStartE2EDuration="1m4.003101461s" podCreationTimestamp="2025-12-16 08:04:53 +0000 UTC" firstStartedPulling="2025-12-16 08:04:55.233846402 +0000 UTC m=+4433.495734021" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:05:57.001664804 +0000 UTC m=+4495.263552453" watchObservedRunningTime="2025-12-16 08:05:57.003101461 +0000 UTC m=+4495.264989090" Dec 16 08:05:57 crc kubenswrapper[4789]: I1216 08:05:57.031246 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.270589352 podStartE2EDuration="1m4.031228868s" podCreationTimestamp="2025-12-16 08:04:53 +0000 UTC" firstStartedPulling="2025-12-16 08:04:55.618256906 +0000 UTC m=+4433.880144525" lastFinishedPulling="2025-12-16 08:05:20.378896412 +0000 UTC m=+4458.640784041" observedRunningTime="2025-12-16 08:05:57.026124413 +0000 UTC m=+4495.288012092" watchObservedRunningTime="2025-12-16 08:05:57.031228868 +0000 UTC m=+4495.293116497" Dec 16 08:06:14 crc kubenswrapper[4789]: I1216 08:06:14.653982 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 08:06:14 crc kubenswrapper[4789]: I1216 08:06:14.950117 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.485823 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-9gvkk"] Dec 16 08:06:21 crc kubenswrapper[4789]: E1216 08:06:21.486748 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" containerName="init" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.486765 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" containerName="init" Dec 16 08:06:21 crc kubenswrapper[4789]: E1216 08:06:21.486793 
4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" containerName="dnsmasq-dns" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.486801 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" containerName="dnsmasq-dns" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.486983 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4920a698-9500-40bb-b5ba-0a9036ee5fcf" containerName="dnsmasq-dns" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.487957 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.528947 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-9gvkk"] Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.544133 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcm4n\" (UniqueName: \"kubernetes.io/projected/0af3676a-e3f5-4a94-a36d-b350e3b22769-kube-api-access-qcm4n\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.544188 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-dns-svc\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.544232 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-config\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: 
\"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.646187 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcm4n\" (UniqueName: \"kubernetes.io/projected/0af3676a-e3f5-4a94-a36d-b350e3b22769-kube-api-access-qcm4n\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.646592 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-dns-svc\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.646634 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-config\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.647566 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-config\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.648142 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-dns-svc\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc 
kubenswrapper[4789]: I1216 08:06:21.662620 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcm4n\" (UniqueName: \"kubernetes.io/projected/0af3676a-e3f5-4a94-a36d-b350e3b22769-kube-api-access-qcm4n\") pod \"dnsmasq-dns-55db7cd99c-9gvkk\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.808006 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.927795 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:06:21 crc kubenswrapper[4789]: I1216 08:06:21.928138 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:06:22 crc kubenswrapper[4789]: I1216 08:06:22.097723 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:06:22 crc kubenswrapper[4789]: I1216 08:06:22.256517 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-9gvkk"] Dec 16 08:06:22 crc kubenswrapper[4789]: I1216 08:06:22.881148 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:06:23 crc kubenswrapper[4789]: I1216 08:06:23.182182 4789 generic.go:334] "Generic (PLEG): container finished" podID="0af3676a-e3f5-4a94-a36d-b350e3b22769" 
containerID="98c73c84bbd9e6e552dcfa4296254caefc9be43e5775f19646015cc31e5915f0" exitCode=0 Dec 16 08:06:23 crc kubenswrapper[4789]: I1216 08:06:23.182227 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" event={"ID":"0af3676a-e3f5-4a94-a36d-b350e3b22769","Type":"ContainerDied","Data":"98c73c84bbd9e6e552dcfa4296254caefc9be43e5775f19646015cc31e5915f0"} Dec 16 08:06:23 crc kubenswrapper[4789]: I1216 08:06:23.182256 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" event={"ID":"0af3676a-e3f5-4a94-a36d-b350e3b22769","Type":"ContainerStarted","Data":"aea66839aa342f93398e60512eb78545242773c1c71edc9bcea92cb08b6c9e27"} Dec 16 08:06:23 crc kubenswrapper[4789]: I1216 08:06:23.878686 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="20764076-1e10-41bf-ad47-4879689fb282" containerName="rabbitmq" containerID="cri-o://891f8b7728a503834b7471d89550d4e38caa4f3dda96978d0b0dcee7071c10a3" gracePeriod=604799 Dec 16 08:06:24 crc kubenswrapper[4789]: I1216 08:06:24.196062 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" event={"ID":"0af3676a-e3f5-4a94-a36d-b350e3b22769","Type":"ContainerStarted","Data":"cec414a1d477528bfbb8b0c519c7e253da6e5b427769252601980687bac66252"} Dec 16 08:06:24 crc kubenswrapper[4789]: I1216 08:06:24.196252 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:24 crc kubenswrapper[4789]: I1216 08:06:24.231849 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" podStartSLOduration=3.231820965 podStartE2EDuration="3.231820965s" podCreationTimestamp="2025-12-16 08:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
08:06:24.225023969 +0000 UTC m=+4522.486911618" watchObservedRunningTime="2025-12-16 08:06:24.231820965 +0000 UTC m=+4522.493708604" Dec 16 08:06:24 crc kubenswrapper[4789]: I1216 08:06:24.552741 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="868049ed-5783-4da6-91b3-39954ca45bab" containerName="rabbitmq" containerID="cri-o://73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef" gracePeriod=604799 Dec 16 08:06:24 crc kubenswrapper[4789]: I1216 08:06:24.651420 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="20764076-1e10-41bf-ad47-4879689fb282" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.241:5672: connect: connection refused" Dec 16 08:06:24 crc kubenswrapper[4789]: I1216 08:06:24.948673 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="868049ed-5783-4da6-91b3-39954ca45bab" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.242:5672: connect: connection refused" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.244379 4789 generic.go:334] "Generic (PLEG): container finished" podID="20764076-1e10-41bf-ad47-4879689fb282" containerID="891f8b7728a503834b7471d89550d4e38caa4f3dda96978d0b0dcee7071c10a3" exitCode=0 Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.245230 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20764076-1e10-41bf-ad47-4879689fb282","Type":"ContainerDied","Data":"891f8b7728a503834b7471d89550d4e38caa4f3dda96978d0b0dcee7071c10a3"} Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.430521 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487408 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20764076-1e10-41bf-ad47-4879689fb282-pod-info\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487463 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20764076-1e10-41bf-ad47-4879689fb282-erlang-cookie-secret\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487538 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-erlang-cookie\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487569 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-plugins\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487622 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-server-conf\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487658 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-plugins-conf\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487717 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-confd\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487761 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmqrv\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-kube-api-access-wmqrv\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.487879 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"20764076-1e10-41bf-ad47-4879689fb282\" (UID: \"20764076-1e10-41bf-ad47-4879689fb282\") " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.488159 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.488520 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.488562 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.565395 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-kube-api-access-wmqrv" (OuterVolumeSpecName: "kube-api-access-wmqrv") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "kube-api-access-wmqrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.565520 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/20764076-1e10-41bf-ad47-4879689fb282-pod-info" (OuterVolumeSpecName: "pod-info") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.565688 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20764076-1e10-41bf-ad47-4879689fb282-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.565939 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09" (OuterVolumeSpecName: "persistence") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "pvc-29cd3bad-8a0d-44de-839a-3faee902fc09". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.574029 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-server-conf" (OuterVolumeSpecName: "server-conf") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.590064 4789 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20764076-1e10-41bf-ad47-4879689fb282-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.590096 4789 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20764076-1e10-41bf-ad47-4879689fb282-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.590108 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.590117 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.590125 4789 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.590134 4789 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20764076-1e10-41bf-ad47-4879689fb282-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.590143 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmqrv\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-kube-api-access-wmqrv\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: 
I1216 08:06:30.590174 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") on node \"crc\" " Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.622727 4789 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.622966 4789 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-29cd3bad-8a0d-44de-839a-3faee902fc09" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09") on node "crc" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.631167 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "20764076-1e10-41bf-ad47-4879689fb282" (UID: "20764076-1e10-41bf-ad47-4879689fb282"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.691765 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20764076-1e10-41bf-ad47-4879689fb282-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:30 crc kubenswrapper[4789]: I1216 08:06:30.692148 4789 reconciler_common.go:293] "Volume detached for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.000503 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.096773 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868049ed-5783-4da6-91b3-39954ca45bab-pod-info\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.096842 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-plugins\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.096876 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-confd\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.096943 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmjtz\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-kube-api-access-zmjtz\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.096970 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-plugins-conf\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.097021 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868049ed-5783-4da6-91b3-39954ca45bab-erlang-cookie-secret\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.097049 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-server-conf\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.097144 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-erlang-cookie\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.097289 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"868049ed-5783-4da6-91b3-39954ca45bab\" (UID: \"868049ed-5783-4da6-91b3-39954ca45bab\") " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.097739 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.097935 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.098199 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.098468 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.098538 4789 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.100297 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-kube-api-access-zmjtz" (OuterVolumeSpecName: "kube-api-access-zmjtz") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "kube-api-access-zmjtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.101312 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/868049ed-5783-4da6-91b3-39954ca45bab-pod-info" (OuterVolumeSpecName: "pod-info") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.103939 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868049ed-5783-4da6-91b3-39954ca45bab-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.108226 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d" (OuterVolumeSpecName: "persistence") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.116030 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-server-conf" (OuterVolumeSpecName: "server-conf") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.165281 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "868049ed-5783-4da6-91b3-39954ca45bab" (UID: "868049ed-5783-4da6-91b3-39954ca45bab"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.200030 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.200083 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") on node \"crc\" " Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.200101 4789 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/868049ed-5783-4da6-91b3-39954ca45bab-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.200113 4789 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.200125 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmjtz\" (UniqueName: \"kubernetes.io/projected/868049ed-5783-4da6-91b3-39954ca45bab-kube-api-access-zmjtz\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.200137 4789 
reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/868049ed-5783-4da6-91b3-39954ca45bab-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.200148 4789 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/868049ed-5783-4da6-91b3-39954ca45bab-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.255571 4789 generic.go:334] "Generic (PLEG): container finished" podID="868049ed-5783-4da6-91b3-39954ca45bab" containerID="73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef" exitCode=0 Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.255613 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868049ed-5783-4da6-91b3-39954ca45bab","Type":"ContainerDied","Data":"73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef"} Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.255660 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"868049ed-5783-4da6-91b3-39954ca45bab","Type":"ContainerDied","Data":"431f67cf54f0b78cd4cefdccb3029ebfa4627313b521f0c292be04fe75953fc8"} Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.255670 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.255685 4789 scope.go:117] "RemoveContainer" containerID="73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.260473 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20764076-1e10-41bf-ad47-4879689fb282","Type":"ContainerDied","Data":"2c1358d4ae4d6fa23ba7d8fe0fe0c08d371041130932f45766c7a277e2118302"} Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.260622 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.581221 4789 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.581437 4789 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d") on node "crc" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.604860 4789 reconciler_common.go:293] "Volume detached for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.653696 4789 scope.go:117] "RemoveContainer" containerID="dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.676486 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.691978 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/rabbitmq-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.701998 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.715616 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.725261 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: E1216 08:06:31.725692 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868049ed-5783-4da6-91b3-39954ca45bab" containerName="setup-container" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.725713 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="868049ed-5783-4da6-91b3-39954ca45bab" containerName="setup-container" Dec 16 08:06:31 crc kubenswrapper[4789]: E1216 08:06:31.725733 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868049ed-5783-4da6-91b3-39954ca45bab" containerName="rabbitmq" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.725742 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="868049ed-5783-4da6-91b3-39954ca45bab" containerName="rabbitmq" Dec 16 08:06:31 crc kubenswrapper[4789]: E1216 08:06:31.725758 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20764076-1e10-41bf-ad47-4879689fb282" containerName="rabbitmq" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.725769 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="20764076-1e10-41bf-ad47-4879689fb282" containerName="rabbitmq" Dec 16 08:06:31 crc kubenswrapper[4789]: E1216 08:06:31.725789 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20764076-1e10-41bf-ad47-4879689fb282" containerName="setup-container" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.725797 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20764076-1e10-41bf-ad47-4879689fb282" containerName="setup-container" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.726012 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="20764076-1e10-41bf-ad47-4879689fb282" containerName="rabbitmq" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.726037 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="868049ed-5783-4da6-91b3-39954ca45bab" containerName="rabbitmq" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.727098 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.729900 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.732671 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.733128 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.733751 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.734198 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.734378 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dl2b6" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.734509 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.736253 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" 
Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.736462 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.736566 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.736770 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.736874 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8rgz7" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.743135 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.755971 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.782317 4789 scope.go:117] "RemoveContainer" containerID="73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef" Dec 16 08:06:31 crc kubenswrapper[4789]: E1216 08:06:31.782873 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef\": container with ID starting with 73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef not found: ID does not exist" containerID="73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.782964 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef"} err="failed to get container status \"73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef\": rpc 
error: code = NotFound desc = could not find container \"73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef\": container with ID starting with 73f24e673ac211eef69652fc1c7f7090ac9710ab06a21474899260041a82fdef not found: ID does not exist" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.782988 4789 scope.go:117] "RemoveContainer" containerID="dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695" Dec 16 08:06:31 crc kubenswrapper[4789]: E1216 08:06:31.783335 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695\": container with ID starting with dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695 not found: ID does not exist" containerID="dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.783367 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695"} err="failed to get container status \"dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695\": rpc error: code = NotFound desc = could not find container \"dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695\": container with ID starting with dd7e9b47cbbd6db018e851a77d5f69e9504ca4bb05678f0a26461bd69ff73695 not found: ID does not exist" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.783387 4789 scope.go:117] "RemoveContainer" containerID="891f8b7728a503834b7471d89550d4e38caa4f3dda96978d0b0dcee7071c10a3" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.805086 4789 scope.go:117] "RemoveContainer" containerID="42b7c0f1608e472d03092ed54d07044d721c7ca1f35b1972eaa82e0aed56603b" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812604 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b059bdb-f5c3-47eb-88f4-b89b3529450c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812651 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812681 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f88e8a07-49e9-4e55-9b79-18990a74ac97-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812703 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f88e8a07-49e9-4e55-9b79-18990a74ac97-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812735 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812758 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812777 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b059bdb-f5c3-47eb-88f4-b89b3529450c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812796 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812819 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f88e8a07-49e9-4e55-9b79-18990a74ac97-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812834 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812858 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dfzgf\" (UniqueName: \"kubernetes.io/projected/1b059bdb-f5c3-47eb-88f4-b89b3529450c-kube-api-access-dfzgf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.812971 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b059bdb-f5c3-47eb-88f4-b89b3529450c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.813009 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.813074 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b059bdb-f5c3-47eb-88f4-b89b3529450c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.813104 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5kph\" (UniqueName: \"kubernetes.io/projected/f88e8a07-49e9-4e55-9b79-18990a74ac97-kube-api-access-c5kph\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.813116 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:06:31 crc 
kubenswrapper[4789]: I1216 08:06:31.813163 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f88e8a07-49e9-4e55-9b79-18990a74ac97-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.813190 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.813206 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.862289 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57484c487-vtj5c"] Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.863210 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57484c487-vtj5c" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerName="dnsmasq-dns" containerID="cri-o://c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f" gracePeriod=10 Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914369 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914550 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b059bdb-f5c3-47eb-88f4-b89b3529450c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914586 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914628 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f88e8a07-49e9-4e55-9b79-18990a74ac97-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914660 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f88e8a07-49e9-4e55-9b79-18990a74ac97-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914696 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914716 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914739 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b059bdb-f5c3-47eb-88f4-b89b3529450c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914775 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914816 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f88e8a07-49e9-4e55-9b79-18990a74ac97-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914839 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914865 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfzgf\" (UniqueName: \"kubernetes.io/projected/1b059bdb-f5c3-47eb-88f4-b89b3529450c-kube-api-access-dfzgf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914886 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b059bdb-f5c3-47eb-88f4-b89b3529450c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914903 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914958 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b059bdb-f5c3-47eb-88f4-b89b3529450c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914981 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5kph\" (UniqueName: \"kubernetes.io/projected/f88e8a07-49e9-4e55-9b79-18990a74ac97-kube-api-access-c5kph\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.914984 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.915020 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f88e8a07-49e9-4e55-9b79-18990a74ac97-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.915054 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.915642 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.915851 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.916572 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f88e8a07-49e9-4e55-9b79-18990a74ac97-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.916666 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b059bdb-f5c3-47eb-88f4-b89b3529450c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.917129 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f88e8a07-49e9-4e55-9b79-18990a74ac97-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.917823 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.918604 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b059bdb-f5c3-47eb-88f4-b89b3529450c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.920371 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b059bdb-f5c3-47eb-88f4-b89b3529450c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " 
pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.920388 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f88e8a07-49e9-4e55-9b79-18990a74ac97-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.920432 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f88e8a07-49e9-4e55-9b79-18990a74ac97-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.920793 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b059bdb-f5c3-47eb-88f4-b89b3529450c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.921069 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b059bdb-f5c3-47eb-88f4-b89b3529450c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.922520 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.922557 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2f37d39606cb1deeffe1438d067eb417957ccefc73ea0d2a89b934fb2a08fd9/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.922830 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.922859 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b609b3066bfbcfea006523a4f9903d81b2cd3d231d4ffc49326ec9c9c517e442/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.927378 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f88e8a07-49e9-4e55-9b79-18990a74ac97-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.931473 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfzgf\" (UniqueName: \"kubernetes.io/projected/1b059bdb-f5c3-47eb-88f4-b89b3529450c-kube-api-access-dfzgf\") pod 
\"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.939760 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5kph\" (UniqueName: \"kubernetes.io/projected/f88e8a07-49e9-4e55-9b79-18990a74ac97-kube-api-access-c5kph\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.951221 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-88a1a834-eee6-4ec3-9800-9bdc7b59129d\") pod \"rabbitmq-cell1-server-0\" (UID: \"f88e8a07-49e9-4e55-9b79-18990a74ac97\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:31 crc kubenswrapper[4789]: I1216 08:06:31.956765 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cd3bad-8a0d-44de-839a-3faee902fc09\") pod \"rabbitmq-server-0\" (UID: \"1b059bdb-f5c3-47eb-88f4-b89b3529450c\") " pod="openstack/rabbitmq-server-0" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.054638 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.065309 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.152112 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20764076-1e10-41bf-ad47-4879689fb282" path="/var/lib/kubelet/pods/20764076-1e10-41bf-ad47-4879689fb282/volumes" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.152964 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868049ed-5783-4da6-91b3-39954ca45bab" path="/var/lib/kubelet/pods/868049ed-5783-4da6-91b3-39954ca45bab/volumes" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.256849 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.282242 4789 generic.go:334] "Generic (PLEG): container finished" podID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerID="c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f" exitCode=0 Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.282282 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-vtj5c" event={"ID":"4fa7df80-d58b-4152-a351-ab7e27f2e9d2","Type":"ContainerDied","Data":"c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f"} Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.282308 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57484c487-vtj5c" event={"ID":"4fa7df80-d58b-4152-a351-ab7e27f2e9d2","Type":"ContainerDied","Data":"f64ae994115417ebd036ea42cb0586d67f02828c8043bb49d635b87744193f18"} Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.282326 4789 scope.go:117] "RemoveContainer" containerID="c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.282382 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57484c487-vtj5c" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.302192 4789 scope.go:117] "RemoveContainer" containerID="78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.318078 4789 scope.go:117] "RemoveContainer" containerID="c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f" Dec 16 08:06:32 crc kubenswrapper[4789]: E1216 08:06:32.318480 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f\": container with ID starting with c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f not found: ID does not exist" containerID="c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.318583 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f"} err="failed to get container status \"c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f\": rpc error: code = NotFound desc = could not find container \"c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f\": container with ID starting with c72e20c79be660abfb7c201ef20aef7ce9693082a1730c82b17b52a50d21a30f not found: ID does not exist" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.318664 4789 scope.go:117] "RemoveContainer" containerID="78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032" Dec 16 08:06:32 crc kubenswrapper[4789]: E1216 08:06:32.319991 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032\": container with ID starting with 
78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032 not found: ID does not exist" containerID="78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.320037 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032"} err="failed to get container status \"78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032\": rpc error: code = NotFound desc = could not find container \"78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032\": container with ID starting with 78d53eb09bea69aaa8f306a3f9f8d284483d9e6b2bb571e63626fe12f8b99032 not found: ID does not exist" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.323752 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7gqz\" (UniqueName: \"kubernetes.io/projected/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-kube-api-access-l7gqz\") pod \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.323801 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-dns-svc\") pod \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.323859 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-config\") pod \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\" (UID: \"4fa7df80-d58b-4152-a351-ab7e27f2e9d2\") " Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.329677 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-kube-api-access-l7gqz" (OuterVolumeSpecName: "kube-api-access-l7gqz") pod "4fa7df80-d58b-4152-a351-ab7e27f2e9d2" (UID: "4fa7df80-d58b-4152-a351-ab7e27f2e9d2"). InnerVolumeSpecName "kube-api-access-l7gqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.359280 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-config" (OuterVolumeSpecName: "config") pod "4fa7df80-d58b-4152-a351-ab7e27f2e9d2" (UID: "4fa7df80-d58b-4152-a351-ab7e27f2e9d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.370250 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4fa7df80-d58b-4152-a351-ab7e27f2e9d2" (UID: "4fa7df80-d58b-4152-a351-ab7e27f2e9d2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.425971 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.426012 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7gqz\" (UniqueName: \"kubernetes.io/projected/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-kube-api-access-l7gqz\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.426022 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa7df80-d58b-4152-a351-ab7e27f2e9d2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.582217 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.589521 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.614159 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57484c487-vtj5c"] Dec 16 08:06:32 crc kubenswrapper[4789]: I1216 08:06:32.618803 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57484c487-vtj5c"] Dec 16 08:06:33 crc kubenswrapper[4789]: I1216 08:06:33.290269 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f88e8a07-49e9-4e55-9b79-18990a74ac97","Type":"ContainerStarted","Data":"89fd79fc72750cb78a5e1ada52b32be96efd26d0ae912279edc0156490573b33"} Dec 16 08:06:33 crc kubenswrapper[4789]: I1216 08:06:33.292904 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"1b059bdb-f5c3-47eb-88f4-b89b3529450c","Type":"ContainerStarted","Data":"1ea06f5dc7cad5065fd4f3053880be90cc476387b481059a7fc3a5a762023c9a"} Dec 16 08:06:34 crc kubenswrapper[4789]: I1216 08:06:34.116709 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" path="/var/lib/kubelet/pods/4fa7df80-d58b-4152-a351-ab7e27f2e9d2/volumes" Dec 16 08:06:34 crc kubenswrapper[4789]: I1216 08:06:34.303608 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b059bdb-f5c3-47eb-88f4-b89b3529450c","Type":"ContainerStarted","Data":"0f75c7ab531c4525bc99628517617e6612866a51c5e4b8ae944cf727b118fbe9"} Dec 16 08:06:34 crc kubenswrapper[4789]: I1216 08:06:34.310206 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f88e8a07-49e9-4e55-9b79-18990a74ac97","Type":"ContainerStarted","Data":"cd8e598da3bbdac78d596cacb46f097f96666aa27bdcafa5551dbbd574ce2cc4"} Dec 16 08:06:51 crc kubenswrapper[4789]: I1216 08:06:51.928223 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:06:51 crc kubenswrapper[4789]: I1216 08:06:51.928784 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:07:06 crc kubenswrapper[4789]: I1216 08:07:06.533030 4789 generic.go:334] "Generic (PLEG): container finished" podID="1b059bdb-f5c3-47eb-88f4-b89b3529450c" 
containerID="0f75c7ab531c4525bc99628517617e6612866a51c5e4b8ae944cf727b118fbe9" exitCode=0 Dec 16 08:07:06 crc kubenswrapper[4789]: I1216 08:07:06.533137 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b059bdb-f5c3-47eb-88f4-b89b3529450c","Type":"ContainerDied","Data":"0f75c7ab531c4525bc99628517617e6612866a51c5e4b8ae944cf727b118fbe9"} Dec 16 08:07:06 crc kubenswrapper[4789]: I1216 08:07:06.536689 4789 generic.go:334] "Generic (PLEG): container finished" podID="f88e8a07-49e9-4e55-9b79-18990a74ac97" containerID="cd8e598da3bbdac78d596cacb46f097f96666aa27bdcafa5551dbbd574ce2cc4" exitCode=0 Dec 16 08:07:06 crc kubenswrapper[4789]: I1216 08:07:06.536741 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f88e8a07-49e9-4e55-9b79-18990a74ac97","Type":"ContainerDied","Data":"cd8e598da3bbdac78d596cacb46f097f96666aa27bdcafa5551dbbd574ce2cc4"} Dec 16 08:07:07 crc kubenswrapper[4789]: I1216 08:07:07.544522 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1b059bdb-f5c3-47eb-88f4-b89b3529450c","Type":"ContainerStarted","Data":"14c7bbcc9d9fac3d4b6a4ed7e547b3cfbd802b44e9e655eaca6ffaa1fdde3408"} Dec 16 08:07:07 crc kubenswrapper[4789]: I1216 08:07:07.546275 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 08:07:07 crc kubenswrapper[4789]: I1216 08:07:07.547256 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f88e8a07-49e9-4e55-9b79-18990a74ac97","Type":"ContainerStarted","Data":"68cdb1789ce78c489a4d66f63c23ba6e0b0cb1f5aa6a52281e17e164630d337f"} Dec 16 08:07:07 crc kubenswrapper[4789]: I1216 08:07:07.547939 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:07:07 crc kubenswrapper[4789]: I1216 08:07:07.567509 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.56747851 podStartE2EDuration="36.56747851s" podCreationTimestamp="2025-12-16 08:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:07:07.565303167 +0000 UTC m=+4565.827190796" watchObservedRunningTime="2025-12-16 08:07:07.56747851 +0000 UTC m=+4565.829366139" Dec 16 08:07:07 crc kubenswrapper[4789]: I1216 08:07:07.591516 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.591496967 podStartE2EDuration="36.591496967s" podCreationTimestamp="2025-12-16 08:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:07:07.58630565 +0000 UTC m=+4565.848193289" watchObservedRunningTime="2025-12-16 08:07:07.591496967 +0000 UTC m=+4565.853384596" Dec 16 08:07:21 crc kubenswrapper[4789]: I1216 08:07:21.927587 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:07:21 crc kubenswrapper[4789]: I1216 08:07:21.928141 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:07:21 crc kubenswrapper[4789]: I1216 08:07:21.928201 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:07:21 crc kubenswrapper[4789]: I1216 08:07:21.928789 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:07:21 crc kubenswrapper[4789]: I1216 08:07:21.928838 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" gracePeriod=600 Dec 16 08:07:22 crc kubenswrapper[4789]: I1216 08:07:22.059142 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 08:07:22 crc kubenswrapper[4789]: E1216 08:07:22.059495 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:07:22 crc kubenswrapper[4789]: I1216 08:07:22.068115 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 08:07:22 crc kubenswrapper[4789]: I1216 08:07:22.676625 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" exitCode=0 Dec 16 
08:07:22 crc kubenswrapper[4789]: I1216 08:07:22.676671 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193"} Dec 16 08:07:22 crc kubenswrapper[4789]: I1216 08:07:22.676714 4789 scope.go:117] "RemoveContainer" containerID="c2e924339a8b79f5acea702841f03d79960142f0749c3e2bfe47fc0008691ee8" Dec 16 08:07:22 crc kubenswrapper[4789]: I1216 08:07:22.677305 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:07:22 crc kubenswrapper[4789]: E1216 08:07:22.677673 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.728616 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:07:27 crc kubenswrapper[4789]: E1216 08:07:27.729449 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerName="dnsmasq-dns" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.729463 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerName="dnsmasq-dns" Dec 16 08:07:27 crc kubenswrapper[4789]: E1216 08:07:27.729478 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerName="init" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.729484 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerName="init" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.729663 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa7df80-d58b-4152-a351-ab7e27f2e9d2" containerName="dnsmasq-dns" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.730209 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.732230 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lsf8v" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.740249 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.747095 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xk5m\" (UniqueName: \"kubernetes.io/projected/cf6d2734-0938-4416-b7f9-04533a174780-kube-api-access-8xk5m\") pod \"mariadb-client-1-default\" (UID: \"cf6d2734-0938-4416-b7f9-04533a174780\") " pod="openstack/mariadb-client-1-default" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.848472 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xk5m\" (UniqueName: \"kubernetes.io/projected/cf6d2734-0938-4416-b7f9-04533a174780-kube-api-access-8xk5m\") pod \"mariadb-client-1-default\" (UID: \"cf6d2734-0938-4416-b7f9-04533a174780\") " pod="openstack/mariadb-client-1-default" Dec 16 08:07:27 crc kubenswrapper[4789]: I1216 08:07:27.866267 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xk5m\" (UniqueName: \"kubernetes.io/projected/cf6d2734-0938-4416-b7f9-04533a174780-kube-api-access-8xk5m\") pod \"mariadb-client-1-default\" (UID: \"cf6d2734-0938-4416-b7f9-04533a174780\") " 
pod="openstack/mariadb-client-1-default" Dec 16 08:07:28 crc kubenswrapper[4789]: I1216 08:07:28.074452 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:07:28 crc kubenswrapper[4789]: I1216 08:07:28.532299 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:07:28 crc kubenswrapper[4789]: I1216 08:07:28.720728 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"cf6d2734-0938-4416-b7f9-04533a174780","Type":"ContainerStarted","Data":"ac13e521527186a0795b59ba7c8baca51f0d523b30a0dacb0a69db125f0ae56b"} Dec 16 08:07:29 crc kubenswrapper[4789]: I1216 08:07:29.729102 4789 generic.go:334] "Generic (PLEG): container finished" podID="cf6d2734-0938-4416-b7f9-04533a174780" containerID="60e755ba1242bfe4598b34685476e17005acf5d2bf04a5e0e82f9adeed061b28" exitCode=0 Dec 16 08:07:29 crc kubenswrapper[4789]: I1216 08:07:29.729172 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"cf6d2734-0938-4416-b7f9-04533a174780","Type":"ContainerDied","Data":"60e755ba1242bfe4598b34685476e17005acf5d2bf04a5e0e82f9adeed061b28"} Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.101508 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.127511 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_cf6d2734-0938-4416-b7f9-04533a174780/mariadb-client-1-default/0.log" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.156977 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.159394 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.296837 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xk5m\" (UniqueName: \"kubernetes.io/projected/cf6d2734-0938-4416-b7f9-04533a174780-kube-api-access-8xk5m\") pod \"cf6d2734-0938-4416-b7f9-04533a174780\" (UID: \"cf6d2734-0938-4416-b7f9-04533a174780\") " Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.302083 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6d2734-0938-4416-b7f9-04533a174780-kube-api-access-8xk5m" (OuterVolumeSpecName: "kube-api-access-8xk5m") pod "cf6d2734-0938-4416-b7f9-04533a174780" (UID: "cf6d2734-0938-4416-b7f9-04533a174780"). InnerVolumeSpecName "kube-api-access-8xk5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.398794 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xk5m\" (UniqueName: \"kubernetes.io/projected/cf6d2734-0938-4416-b7f9-04533a174780-kube-api-access-8xk5m\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.596064 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:07:31 crc kubenswrapper[4789]: E1216 08:07:31.596496 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6d2734-0938-4416-b7f9-04533a174780" containerName="mariadb-client-1-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.596519 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6d2734-0938-4416-b7f9-04533a174780" containerName="mariadb-client-1-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.596906 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6d2734-0938-4416-b7f9-04533a174780" containerName="mariadb-client-1-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.597522 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.606827 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.702850 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlxh2\" (UniqueName: \"kubernetes.io/projected/e2ea055e-11f1-40ab-8fea-309a532de281-kube-api-access-dlxh2\") pod \"mariadb-client-2-default\" (UID: \"e2ea055e-11f1-40ab-8fea-309a532de281\") " pod="openstack/mariadb-client-2-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.746695 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac13e521527186a0795b59ba7c8baca51f0d523b30a0dacb0a69db125f0ae56b" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.746746 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.805463 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlxh2\" (UniqueName: \"kubernetes.io/projected/e2ea055e-11f1-40ab-8fea-309a532de281-kube-api-access-dlxh2\") pod \"mariadb-client-2-default\" (UID: \"e2ea055e-11f1-40ab-8fea-309a532de281\") " pod="openstack/mariadb-client-2-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.827222 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlxh2\" (UniqueName: \"kubernetes.io/projected/e2ea055e-11f1-40ab-8fea-309a532de281-kube-api-access-dlxh2\") pod \"mariadb-client-2-default\" (UID: \"e2ea055e-11f1-40ab-8fea-309a532de281\") " pod="openstack/mariadb-client-2-default" Dec 16 08:07:31 crc kubenswrapper[4789]: I1216 08:07:31.922235 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:07:32 crc kubenswrapper[4789]: I1216 08:07:32.118972 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf6d2734-0938-4416-b7f9-04533a174780" path="/var/lib/kubelet/pods/cf6d2734-0938-4416-b7f9-04533a174780/volumes" Dec 16 08:07:32 crc kubenswrapper[4789]: I1216 08:07:32.435798 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:07:32 crc kubenswrapper[4789]: W1216 08:07:32.439161 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2ea055e_11f1_40ab_8fea_309a532de281.slice/crio-7d2884236f204731c1e47e23e75f73ed562b30e252af4fb5c9e7af4807d4199c WatchSource:0}: Error finding container 7d2884236f204731c1e47e23e75f73ed562b30e252af4fb5c9e7af4807d4199c: Status 404 returned error can't find the container with id 7d2884236f204731c1e47e23e75f73ed562b30e252af4fb5c9e7af4807d4199c Dec 16 08:07:32 crc kubenswrapper[4789]: I1216 08:07:32.754165 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"e2ea055e-11f1-40ab-8fea-309a532de281","Type":"ContainerStarted","Data":"763907aa41f0a740f5fb126fe7a601bc456599e3461c9b622f13313a607321bf"} Dec 16 08:07:32 crc kubenswrapper[4789]: I1216 08:07:32.754506 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"e2ea055e-11f1-40ab-8fea-309a532de281","Type":"ContainerStarted","Data":"7d2884236f204731c1e47e23e75f73ed562b30e252af4fb5c9e7af4807d4199c"} Dec 16 08:07:32 crc kubenswrapper[4789]: I1216 08:07:32.769460 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.7694396000000001 podStartE2EDuration="1.7694396s" podCreationTimestamp="2025-12-16 08:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:07:32.76574239 +0000 UTC m=+4591.027630019" watchObservedRunningTime="2025-12-16 08:07:32.7694396 +0000 UTC m=+4591.031327229" Dec 16 08:07:33 crc kubenswrapper[4789]: I1216 08:07:33.777008 4789 generic.go:334] "Generic (PLEG): container finished" podID="e2ea055e-11f1-40ab-8fea-309a532de281" containerID="763907aa41f0a740f5fb126fe7a601bc456599e3461c9b622f13313a607321bf" exitCode=1 Dec 16 08:07:33 crc kubenswrapper[4789]: I1216 08:07:33.777059 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"e2ea055e-11f1-40ab-8fea-309a532de281","Type":"ContainerDied","Data":"763907aa41f0a740f5fb126fe7a601bc456599e3461c9b622f13313a607321bf"} Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.104777 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:07:35 crc kubenswrapper[4789]: E1216 08:07:35.105494 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.229315 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.272071 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.281804 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.359853 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlxh2\" (UniqueName: \"kubernetes.io/projected/e2ea055e-11f1-40ab-8fea-309a532de281-kube-api-access-dlxh2\") pod \"e2ea055e-11f1-40ab-8fea-309a532de281\" (UID: \"e2ea055e-11f1-40ab-8fea-309a532de281\") " Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.365612 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ea055e-11f1-40ab-8fea-309a532de281-kube-api-access-dlxh2" (OuterVolumeSpecName: "kube-api-access-dlxh2") pod "e2ea055e-11f1-40ab-8fea-309a532de281" (UID: "e2ea055e-11f1-40ab-8fea-309a532de281"). InnerVolumeSpecName "kube-api-access-dlxh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.461817 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlxh2\" (UniqueName: \"kubernetes.io/projected/e2ea055e-11f1-40ab-8fea-309a532de281-kube-api-access-dlxh2\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.715977 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:07:35 crc kubenswrapper[4789]: E1216 08:07:35.716805 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ea055e-11f1-40ab-8fea-309a532de281" containerName="mariadb-client-2-default" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.716842 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ea055e-11f1-40ab-8fea-309a532de281" containerName="mariadb-client-2-default" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.717267 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ea055e-11f1-40ab-8fea-309a532de281" containerName="mariadb-client-2-default" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.718466 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.726982 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.766640 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjmfv\" (UniqueName: \"kubernetes.io/projected/736ace1e-d7e3-4656-a393-02c80acb01f4-kube-api-access-fjmfv\") pod \"mariadb-client-1\" (UID: \"736ace1e-d7e3-4656-a393-02c80acb01f4\") " pod="openstack/mariadb-client-1" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.798622 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d2884236f204731c1e47e23e75f73ed562b30e252af4fb5c9e7af4807d4199c" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.798732 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.868161 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjmfv\" (UniqueName: \"kubernetes.io/projected/736ace1e-d7e3-4656-a393-02c80acb01f4-kube-api-access-fjmfv\") pod \"mariadb-client-1\" (UID: \"736ace1e-d7e3-4656-a393-02c80acb01f4\") " pod="openstack/mariadb-client-1" Dec 16 08:07:35 crc kubenswrapper[4789]: I1216 08:07:35.895650 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjmfv\" (UniqueName: \"kubernetes.io/projected/736ace1e-d7e3-4656-a393-02c80acb01f4-kube-api-access-fjmfv\") pod \"mariadb-client-1\" (UID: \"736ace1e-d7e3-4656-a393-02c80acb01f4\") " pod="openstack/mariadb-client-1" Dec 16 08:07:36 crc kubenswrapper[4789]: I1216 08:07:36.046569 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:07:36 crc kubenswrapper[4789]: I1216 08:07:36.115374 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ea055e-11f1-40ab-8fea-309a532de281" path="/var/lib/kubelet/pods/e2ea055e-11f1-40ab-8fea-309a532de281/volumes" Dec 16 08:07:36 crc kubenswrapper[4789]: I1216 08:07:36.590632 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:07:36 crc kubenswrapper[4789]: I1216 08:07:36.807841 4789 generic.go:334] "Generic (PLEG): container finished" podID="736ace1e-d7e3-4656-a393-02c80acb01f4" containerID="d2f44614e4ebb285256bfd2e3ee9e246d59dfcb93529e43f87132f1ab65c7cd0" exitCode=0 Dec 16 08:07:36 crc kubenswrapper[4789]: I1216 08:07:36.807897 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"736ace1e-d7e3-4656-a393-02c80acb01f4","Type":"ContainerDied","Data":"d2f44614e4ebb285256bfd2e3ee9e246d59dfcb93529e43f87132f1ab65c7cd0"} Dec 16 08:07:36 crc kubenswrapper[4789]: I1216 08:07:36.808254 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"736ace1e-d7e3-4656-a393-02c80acb01f4","Type":"ContainerStarted","Data":"6c07e345647b6794a5f6f8e585ed62ec2db9f778a6c4440620256df680067819"} Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.242261 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.263965 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_736ace1e-d7e3-4656-a393-02c80acb01f4/mariadb-client-1/0.log" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.295241 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.304223 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.317999 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjmfv\" (UniqueName: \"kubernetes.io/projected/736ace1e-d7e3-4656-a393-02c80acb01f4-kube-api-access-fjmfv\") pod \"736ace1e-d7e3-4656-a393-02c80acb01f4\" (UID: \"736ace1e-d7e3-4656-a393-02c80acb01f4\") " Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.326160 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736ace1e-d7e3-4656-a393-02c80acb01f4-kube-api-access-fjmfv" (OuterVolumeSpecName: "kube-api-access-fjmfv") pod "736ace1e-d7e3-4656-a393-02c80acb01f4" (UID: "736ace1e-d7e3-4656-a393-02c80acb01f4"). InnerVolumeSpecName "kube-api-access-fjmfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.420293 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjmfv\" (UniqueName: \"kubernetes.io/projected/736ace1e-d7e3-4656-a393-02c80acb01f4-kube-api-access-fjmfv\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.760962 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:07:38 crc kubenswrapper[4789]: E1216 08:07:38.761442 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736ace1e-d7e3-4656-a393-02c80acb01f4" containerName="mariadb-client-1" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.761471 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="736ace1e-d7e3-4656-a393-02c80acb01f4" containerName="mariadb-client-1" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.761728 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="736ace1e-d7e3-4656-a393-02c80acb01f4" containerName="mariadb-client-1" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.762579 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.774167 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.828995 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8hb\" (UniqueName: \"kubernetes.io/projected/d5397d17-65e5-4530-9532-d152a7617756-kube-api-access-vg8hb\") pod \"mariadb-client-4-default\" (UID: \"d5397d17-65e5-4530-9532-d152a7617756\") " pod="openstack/mariadb-client-4-default" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.833795 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c07e345647b6794a5f6f8e585ed62ec2db9f778a6c4440620256df680067819" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.833882 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.930750 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8hb\" (UniqueName: \"kubernetes.io/projected/d5397d17-65e5-4530-9532-d152a7617756-kube-api-access-vg8hb\") pod \"mariadb-client-4-default\" (UID: \"d5397d17-65e5-4530-9532-d152a7617756\") " pod="openstack/mariadb-client-4-default" Dec 16 08:07:38 crc kubenswrapper[4789]: I1216 08:07:38.950040 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8hb\" (UniqueName: \"kubernetes.io/projected/d5397d17-65e5-4530-9532-d152a7617756-kube-api-access-vg8hb\") pod \"mariadb-client-4-default\" (UID: \"d5397d17-65e5-4530-9532-d152a7617756\") " pod="openstack/mariadb-client-4-default" Dec 16 08:07:39 crc kubenswrapper[4789]: I1216 08:07:39.096413 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:07:39 crc kubenswrapper[4789]: I1216 08:07:39.629450 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:07:39 crc kubenswrapper[4789]: W1216 08:07:39.634203 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5397d17_65e5_4530_9532_d152a7617756.slice/crio-057f4c61f3df6e2c09191b4978aec2606e185cfa945ba10f7a37d3df86fcc91d WatchSource:0}: Error finding container 057f4c61f3df6e2c09191b4978aec2606e185cfa945ba10f7a37d3df86fcc91d: Status 404 returned error can't find the container with id 057f4c61f3df6e2c09191b4978aec2606e185cfa945ba10f7a37d3df86fcc91d Dec 16 08:07:39 crc kubenswrapper[4789]: I1216 08:07:39.842208 4789 generic.go:334] "Generic (PLEG): container finished" podID="d5397d17-65e5-4530-9532-d152a7617756" containerID="4ff3fa5bc5ddd34edc97625376d9c1b6b912464bf54b886bdade75bcbac04294" exitCode=0 Dec 16 08:07:39 crc kubenswrapper[4789]: I1216 08:07:39.842308 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d5397d17-65e5-4530-9532-d152a7617756","Type":"ContainerDied","Data":"4ff3fa5bc5ddd34edc97625376d9c1b6b912464bf54b886bdade75bcbac04294"} Dec 16 08:07:39 crc kubenswrapper[4789]: I1216 08:07:39.842540 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d5397d17-65e5-4530-9532-d152a7617756","Type":"ContainerStarted","Data":"057f4c61f3df6e2c09191b4978aec2606e185cfa945ba10f7a37d3df86fcc91d"} Dec 16 08:07:40 crc kubenswrapper[4789]: I1216 08:07:40.117163 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736ace1e-d7e3-4656-a393-02c80acb01f4" path="/var/lib/kubelet/pods/736ace1e-d7e3-4656-a393-02c80acb01f4/volumes" Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.314666 4789 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.336251 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_d5397d17-65e5-4530-9532-d152a7617756/mariadb-client-4-default/0.log" Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.360308 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.365939 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.467741 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg8hb\" (UniqueName: \"kubernetes.io/projected/d5397d17-65e5-4530-9532-d152a7617756-kube-api-access-vg8hb\") pod \"d5397d17-65e5-4530-9532-d152a7617756\" (UID: \"d5397d17-65e5-4530-9532-d152a7617756\") " Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.474460 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5397d17-65e5-4530-9532-d152a7617756-kube-api-access-vg8hb" (OuterVolumeSpecName: "kube-api-access-vg8hb") pod "d5397d17-65e5-4530-9532-d152a7617756" (UID: "d5397d17-65e5-4530-9532-d152a7617756"). InnerVolumeSpecName "kube-api-access-vg8hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.569845 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg8hb\" (UniqueName: \"kubernetes.io/projected/d5397d17-65e5-4530-9532-d152a7617756-kube-api-access-vg8hb\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.858681 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="057f4c61f3df6e2c09191b4978aec2606e185cfa945ba10f7a37d3df86fcc91d" Dec 16 08:07:41 crc kubenswrapper[4789]: I1216 08:07:41.858712 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 16 08:07:42 crc kubenswrapper[4789]: I1216 08:07:42.138902 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5397d17-65e5-4530-9532-d152a7617756" path="/var/lib/kubelet/pods/d5397d17-65e5-4530-9532-d152a7617756/volumes" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.688045 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:07:44 crc kubenswrapper[4789]: E1216 08:07:44.688743 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5397d17-65e5-4530-9532-d152a7617756" containerName="mariadb-client-4-default" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.688760 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5397d17-65e5-4530-9532-d152a7617756" containerName="mariadb-client-4-default" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.688952 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5397d17-65e5-4530-9532-d152a7617756" containerName="mariadb-client-4-default" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.689526 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.692640 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lsf8v" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.699700 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.728886 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4zlm\" (UniqueName: \"kubernetes.io/projected/dafae949-909b-4ac8-877d-6b5a26c647e2-kube-api-access-r4zlm\") pod \"mariadb-client-5-default\" (UID: \"dafae949-909b-4ac8-877d-6b5a26c647e2\") " pod="openstack/mariadb-client-5-default" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.830325 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4zlm\" (UniqueName: \"kubernetes.io/projected/dafae949-909b-4ac8-877d-6b5a26c647e2-kube-api-access-r4zlm\") pod \"mariadb-client-5-default\" (UID: \"dafae949-909b-4ac8-877d-6b5a26c647e2\") " pod="openstack/mariadb-client-5-default" Dec 16 08:07:44 crc kubenswrapper[4789]: I1216 08:07:44.850848 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4zlm\" (UniqueName: \"kubernetes.io/projected/dafae949-909b-4ac8-877d-6b5a26c647e2-kube-api-access-r4zlm\") pod \"mariadb-client-5-default\" (UID: \"dafae949-909b-4ac8-877d-6b5a26c647e2\") " pod="openstack/mariadb-client-5-default" Dec 16 08:07:45 crc kubenswrapper[4789]: I1216 08:07:45.014151 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:07:45 crc kubenswrapper[4789]: I1216 08:07:45.519556 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:07:45 crc kubenswrapper[4789]: I1216 08:07:45.890226 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"dafae949-909b-4ac8-877d-6b5a26c647e2","Type":"ContainerStarted","Data":"ea39b5aeae7cda5cfa6e6c97da01a8bbc4a7191fc48c089eb691e131df63fa03"} Dec 16 08:07:46 crc kubenswrapper[4789]: I1216 08:07:46.105034 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:07:46 crc kubenswrapper[4789]: E1216 08:07:46.105255 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:07:46 crc kubenswrapper[4789]: I1216 08:07:46.908463 4789 generic.go:334] "Generic (PLEG): container finished" podID="dafae949-909b-4ac8-877d-6b5a26c647e2" containerID="d18e14e0d92a6aaf0abf0a6ea5ccfbe8c1be6464e2b2ebabbea19eb58653e0dd" exitCode=0 Dec 16 08:07:46 crc kubenswrapper[4789]: I1216 08:07:46.908524 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"dafae949-909b-4ac8-877d-6b5a26c647e2","Type":"ContainerDied","Data":"d18e14e0d92a6aaf0abf0a6ea5ccfbe8c1be6464e2b2ebabbea19eb58653e0dd"} Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.328717 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.353748 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_dafae949-909b-4ac8-877d-6b5a26c647e2/mariadb-client-5-default/0.log" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.384215 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.389277 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.489619 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4zlm\" (UniqueName: \"kubernetes.io/projected/dafae949-909b-4ac8-877d-6b5a26c647e2-kube-api-access-r4zlm\") pod \"dafae949-909b-4ac8-877d-6b5a26c647e2\" (UID: \"dafae949-909b-4ac8-877d-6b5a26c647e2\") " Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.495452 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dafae949-909b-4ac8-877d-6b5a26c647e2-kube-api-access-r4zlm" (OuterVolumeSpecName: "kube-api-access-r4zlm") pod "dafae949-909b-4ac8-877d-6b5a26c647e2" (UID: "dafae949-909b-4ac8-877d-6b5a26c647e2"). InnerVolumeSpecName "kube-api-access-r4zlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.535808 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:07:48 crc kubenswrapper[4789]: E1216 08:07:48.536285 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dafae949-909b-4ac8-877d-6b5a26c647e2" containerName="mariadb-client-5-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.536316 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafae949-909b-4ac8-877d-6b5a26c647e2" containerName="mariadb-client-5-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.536505 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="dafae949-909b-4ac8-877d-6b5a26c647e2" containerName="mariadb-client-5-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.537149 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.544308 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.591369 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfz7x\" (UniqueName: \"kubernetes.io/projected/0a158cf2-f83c-4d68-9a3b-4e2e9520974b-kube-api-access-vfz7x\") pod \"mariadb-client-6-default\" (UID: \"0a158cf2-f83c-4d68-9a3b-4e2e9520974b\") " pod="openstack/mariadb-client-6-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.592041 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4zlm\" (UniqueName: \"kubernetes.io/projected/dafae949-909b-4ac8-877d-6b5a26c647e2-kube-api-access-r4zlm\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.693986 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vfz7x\" (UniqueName: \"kubernetes.io/projected/0a158cf2-f83c-4d68-9a3b-4e2e9520974b-kube-api-access-vfz7x\") pod \"mariadb-client-6-default\" (UID: \"0a158cf2-f83c-4d68-9a3b-4e2e9520974b\") " pod="openstack/mariadb-client-6-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.712487 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfz7x\" (UniqueName: \"kubernetes.io/projected/0a158cf2-f83c-4d68-9a3b-4e2e9520974b-kube-api-access-vfz7x\") pod \"mariadb-client-6-default\" (UID: \"0a158cf2-f83c-4d68-9a3b-4e2e9520974b\") " pod="openstack/mariadb-client-6-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.862116 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.928544 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea39b5aeae7cda5cfa6e6c97da01a8bbc4a7191fc48c089eb691e131df63fa03" Dec 16 08:07:48 crc kubenswrapper[4789]: I1216 08:07:48.928617 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 16 08:07:49 crc kubenswrapper[4789]: W1216 08:07:49.346237 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a158cf2_f83c_4d68_9a3b_4e2e9520974b.slice/crio-5cb6ad7d0e8199585a35dea6f4ff43da0978c058fb88780b4a581c91533d7bbd WatchSource:0}: Error finding container 5cb6ad7d0e8199585a35dea6f4ff43da0978c058fb88780b4a581c91533d7bbd: Status 404 returned error can't find the container with id 5cb6ad7d0e8199585a35dea6f4ff43da0978c058fb88780b4a581c91533d7bbd Dec 16 08:07:49 crc kubenswrapper[4789]: I1216 08:07:49.349109 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:07:49 crc kubenswrapper[4789]: I1216 08:07:49.938513 4789 generic.go:334] "Generic (PLEG): container finished" podID="0a158cf2-f83c-4d68-9a3b-4e2e9520974b" containerID="af3f62284f3a19d1e48fb5402a6202fda7a7120f9299520571784945a4590110" exitCode=1 Dec 16 08:07:49 crc kubenswrapper[4789]: I1216 08:07:49.938560 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"0a158cf2-f83c-4d68-9a3b-4e2e9520974b","Type":"ContainerDied","Data":"af3f62284f3a19d1e48fb5402a6202fda7a7120f9299520571784945a4590110"} Dec 16 08:07:49 crc kubenswrapper[4789]: I1216 08:07:49.938593 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"0a158cf2-f83c-4d68-9a3b-4e2e9520974b","Type":"ContainerStarted","Data":"5cb6ad7d0e8199585a35dea6f4ff43da0978c058fb88780b4a581c91533d7bbd"} Dec 16 08:07:50 crc kubenswrapper[4789]: I1216 08:07:50.113819 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafae949-909b-4ac8-877d-6b5a26c647e2" path="/var/lib/kubelet/pods/dafae949-909b-4ac8-877d-6b5a26c647e2/volumes" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.363941 4789 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.380693 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_0a158cf2-f83c-4d68-9a3b-4e2e9520974b/mariadb-client-6-default/0.log" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.407069 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.413666 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.533266 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:07:51 crc kubenswrapper[4789]: E1216 08:07:51.534300 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a158cf2-f83c-4d68-9a3b-4e2e9520974b" containerName="mariadb-client-6-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.534330 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a158cf2-f83c-4d68-9a3b-4e2e9520974b" containerName="mariadb-client-6-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.534485 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a158cf2-f83c-4d68-9a3b-4e2e9520974b" containerName="mariadb-client-6-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.535037 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.537955 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfz7x\" (UniqueName: \"kubernetes.io/projected/0a158cf2-f83c-4d68-9a3b-4e2e9520974b-kube-api-access-vfz7x\") pod \"0a158cf2-f83c-4d68-9a3b-4e2e9520974b\" (UID: \"0a158cf2-f83c-4d68-9a3b-4e2e9520974b\") " Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.546245 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a158cf2-f83c-4d68-9a3b-4e2e9520974b-kube-api-access-vfz7x" (OuterVolumeSpecName: "kube-api-access-vfz7x") pod "0a158cf2-f83c-4d68-9a3b-4e2e9520974b" (UID: "0a158cf2-f83c-4d68-9a3b-4e2e9520974b"). InnerVolumeSpecName "kube-api-access-vfz7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.547038 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.639943 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drvg\" (UniqueName: \"kubernetes.io/projected/6c1ea7e1-d4f3-4ef7-86ad-713097103e79-kube-api-access-9drvg\") pod \"mariadb-client-7-default\" (UID: \"6c1ea7e1-d4f3-4ef7-86ad-713097103e79\") " pod="openstack/mariadb-client-7-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.640196 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfz7x\" (UniqueName: \"kubernetes.io/projected/0a158cf2-f83c-4d68-9a3b-4e2e9520974b-kube-api-access-vfz7x\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.741315 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drvg\" (UniqueName: 
\"kubernetes.io/projected/6c1ea7e1-d4f3-4ef7-86ad-713097103e79-kube-api-access-9drvg\") pod \"mariadb-client-7-default\" (UID: \"6c1ea7e1-d4f3-4ef7-86ad-713097103e79\") " pod="openstack/mariadb-client-7-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.758586 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drvg\" (UniqueName: \"kubernetes.io/projected/6c1ea7e1-d4f3-4ef7-86ad-713097103e79-kube-api-access-9drvg\") pod \"mariadb-client-7-default\" (UID: \"6c1ea7e1-d4f3-4ef7-86ad-713097103e79\") " pod="openstack/mariadb-client-7-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.877836 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.957131 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cb6ad7d0e8199585a35dea6f4ff43da0978c058fb88780b4a581c91533d7bbd" Dec 16 08:07:51 crc kubenswrapper[4789]: I1216 08:07:51.957187 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 16 08:07:52 crc kubenswrapper[4789]: I1216 08:07:52.115474 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a158cf2-f83c-4d68-9a3b-4e2e9520974b" path="/var/lib/kubelet/pods/0a158cf2-f83c-4d68-9a3b-4e2e9520974b/volumes" Dec 16 08:07:52 crc kubenswrapper[4789]: I1216 08:07:52.400069 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:07:52 crc kubenswrapper[4789]: W1216 08:07:52.419400 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c1ea7e1_d4f3_4ef7_86ad_713097103e79.slice/crio-ad8d49245feac94c0bfd9e6e56b2de0bc4930f491c085e166a339938e80f45f5 WatchSource:0}: Error finding container ad8d49245feac94c0bfd9e6e56b2de0bc4930f491c085e166a339938e80f45f5: Status 404 returned error can't find the container with id ad8d49245feac94c0bfd9e6e56b2de0bc4930f491c085e166a339938e80f45f5 Dec 16 08:07:52 crc kubenswrapper[4789]: I1216 08:07:52.968511 4789 generic.go:334] "Generic (PLEG): container finished" podID="6c1ea7e1-d4f3-4ef7-86ad-713097103e79" containerID="8923449c0a73f49cd36d602d2e8c9e491993983cbb5f00138067d76fe91236a7" exitCode=0 Dec 16 08:07:52 crc kubenswrapper[4789]: I1216 08:07:52.968648 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"6c1ea7e1-d4f3-4ef7-86ad-713097103e79","Type":"ContainerDied","Data":"8923449c0a73f49cd36d602d2e8c9e491993983cbb5f00138067d76fe91236a7"} Dec 16 08:07:52 crc kubenswrapper[4789]: I1216 08:07:52.968838 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"6c1ea7e1-d4f3-4ef7-86ad-713097103e79","Type":"ContainerStarted","Data":"ad8d49245feac94c0bfd9e6e56b2de0bc4930f491c085e166a339938e80f45f5"} Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.380539 4789 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.400576 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_6c1ea7e1-d4f3-4ef7-86ad-713097103e79/mariadb-client-7-default/0.log" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.426827 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.435787 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.483090 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9drvg\" (UniqueName: \"kubernetes.io/projected/6c1ea7e1-d4f3-4ef7-86ad-713097103e79-kube-api-access-9drvg\") pod \"6c1ea7e1-d4f3-4ef7-86ad-713097103e79\" (UID: \"6c1ea7e1-d4f3-4ef7-86ad-713097103e79\") " Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.582559 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:07:54 crc kubenswrapper[4789]: E1216 08:07:54.582858 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1ea7e1-d4f3-4ef7-86ad-713097103e79" containerName="mariadb-client-7-default" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.582875 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1ea7e1-d4f3-4ef7-86ad-713097103e79" containerName="mariadb-client-7-default" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.583042 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1ea7e1-d4f3-4ef7-86ad-713097103e79" containerName="mariadb-client-7-default" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.584253 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.591128 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.686055 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4lvv\" (UniqueName: \"kubernetes.io/projected/60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d-kube-api-access-g4lvv\") pod \"mariadb-client-2\" (UID: \"60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d\") " pod="openstack/mariadb-client-2" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.787449 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4lvv\" (UniqueName: \"kubernetes.io/projected/60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d-kube-api-access-g4lvv\") pod \"mariadb-client-2\" (UID: \"60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d\") " pod="openstack/mariadb-client-2" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.866102 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1ea7e1-d4f3-4ef7-86ad-713097103e79-kube-api-access-9drvg" (OuterVolumeSpecName: "kube-api-access-9drvg") pod "6c1ea7e1-d4f3-4ef7-86ad-713097103e79" (UID: "6c1ea7e1-d4f3-4ef7-86ad-713097103e79"). InnerVolumeSpecName "kube-api-access-9drvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.868067 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4lvv\" (UniqueName: \"kubernetes.io/projected/60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d-kube-api-access-g4lvv\") pod \"mariadb-client-2\" (UID: \"60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d\") " pod="openstack/mariadb-client-2" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.888858 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9drvg\" (UniqueName: \"kubernetes.io/projected/6c1ea7e1-d4f3-4ef7-86ad-713097103e79-kube-api-access-9drvg\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.901449 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.986103 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad8d49245feac94c0bfd9e6e56b2de0bc4930f491c085e166a339938e80f45f5" Dec 16 08:07:54 crc kubenswrapper[4789]: I1216 08:07:54.986160 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 16 08:07:55 crc kubenswrapper[4789]: I1216 08:07:55.429617 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:07:55 crc kubenswrapper[4789]: W1216 08:07:55.436072 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60bc1f3b_dbdf_4b44_81a1_bf0ea3c9201d.slice/crio-fb4002727b2425d0e40e95617ce7ed1280b066af9c74d791938d33a556ead13c WatchSource:0}: Error finding container fb4002727b2425d0e40e95617ce7ed1280b066af9c74d791938d33a556ead13c: Status 404 returned error can't find the container with id fb4002727b2425d0e40e95617ce7ed1280b066af9c74d791938d33a556ead13c Dec 16 08:07:55 crc kubenswrapper[4789]: I1216 08:07:55.996953 4789 generic.go:334] "Generic (PLEG): container finished" podID="60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d" containerID="566b20d72aade5853918dcad07c3f1c50e2aa0d0d6e8337487a50cfde3f0a6db" exitCode=0 Dec 16 08:07:55 crc kubenswrapper[4789]: I1216 08:07:55.997069 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d","Type":"ContainerDied","Data":"566b20d72aade5853918dcad07c3f1c50e2aa0d0d6e8337487a50cfde3f0a6db"} Dec 16 08:07:55 crc kubenswrapper[4789]: I1216 08:07:55.997552 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d","Type":"ContainerStarted","Data":"fb4002727b2425d0e40e95617ce7ed1280b066af9c74d791938d33a556ead13c"} Dec 16 08:07:56 crc kubenswrapper[4789]: I1216 08:07:56.117098 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1ea7e1-d4f3-4ef7-86ad-713097103e79" path="/var/lib/kubelet/pods/6c1ea7e1-d4f3-4ef7-86ad-713097103e79/volumes" Dec 16 08:07:57 crc kubenswrapper[4789]: I1216 08:07:57.403331 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:07:57 crc kubenswrapper[4789]: I1216 08:07:57.421571 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d/mariadb-client-2/0.log" Dec 16 08:07:57 crc kubenswrapper[4789]: I1216 08:07:57.448257 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:07:57 crc kubenswrapper[4789]: I1216 08:07:57.453536 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 16 08:07:57 crc kubenswrapper[4789]: I1216 08:07:57.535675 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4lvv\" (UniqueName: \"kubernetes.io/projected/60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d-kube-api-access-g4lvv\") pod \"60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d\" (UID: \"60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d\") " Dec 16 08:07:57 crc kubenswrapper[4789]: I1216 08:07:57.542296 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d-kube-api-access-g4lvv" (OuterVolumeSpecName: "kube-api-access-g4lvv") pod "60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d" (UID: "60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d"). InnerVolumeSpecName "kube-api-access-g4lvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:07:57 crc kubenswrapper[4789]: I1216 08:07:57.638217 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4lvv\" (UniqueName: \"kubernetes.io/projected/60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d-kube-api-access-g4lvv\") on node \"crc\" DevicePath \"\"" Dec 16 08:07:58 crc kubenswrapper[4789]: I1216 08:07:58.013622 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb4002727b2425d0e40e95617ce7ed1280b066af9c74d791938d33a556ead13c" Dec 16 08:07:58 crc kubenswrapper[4789]: I1216 08:07:58.013670 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 16 08:07:58 crc kubenswrapper[4789]: I1216 08:07:58.113282 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d" path="/var/lib/kubelet/pods/60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d/volumes" Dec 16 08:07:59 crc kubenswrapper[4789]: I1216 08:07:59.104621 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:07:59 crc kubenswrapper[4789]: E1216 08:07:59.105186 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:08:14 crc kubenswrapper[4789]: I1216 08:08:14.105250 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:08:14 crc kubenswrapper[4789]: E1216 08:08:14.106001 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:08:25 crc kubenswrapper[4789]: I1216 08:08:25.105324 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:08:25 crc kubenswrapper[4789]: E1216 08:08:25.105997 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:08:29 crc kubenswrapper[4789]: I1216 08:08:29.190464 4789 scope.go:117] "RemoveContainer" containerID="de1777f35536cdd7e78f94fa01ece14c847a47ed7a4f967c67f80f8b37023539" Dec 16 08:08:38 crc kubenswrapper[4789]: I1216 08:08:38.104500 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:08:38 crc kubenswrapper[4789]: E1216 08:08:38.105275 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:08:50 crc kubenswrapper[4789]: I1216 08:08:50.105041 4789 scope.go:117] "RemoveContainer" 
containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:08:50 crc kubenswrapper[4789]: E1216 08:08:50.106160 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:09:01 crc kubenswrapper[4789]: I1216 08:09:01.104832 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:09:01 crc kubenswrapper[4789]: E1216 08:09:01.105474 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:09:12 crc kubenswrapper[4789]: I1216 08:09:12.109651 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:09:12 crc kubenswrapper[4789]: E1216 08:09:12.110475 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:09:25 crc kubenswrapper[4789]: I1216 08:09:25.105649 4789 scope.go:117] 
"RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:09:25 crc kubenswrapper[4789]: E1216 08:09:25.106424 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:09:40 crc kubenswrapper[4789]: I1216 08:09:40.104777 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:09:40 crc kubenswrapper[4789]: E1216 08:09:40.105494 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.090471 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pcgcf"] Dec 16 08:09:44 crc kubenswrapper[4789]: E1216 08:09:44.091774 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d" containerName="mariadb-client-2" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.091791 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d" containerName="mariadb-client-2" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.091989 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bc1f3b-dbdf-4b44-81a1-bf0ea3c9201d" 
containerName="mariadb-client-2" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.094130 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.130361 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pcgcf"] Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.201934 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lflt\" (UniqueName: \"kubernetes.io/projected/cb4f2893-0708-4a59-b523-d4847a0d2a3d-kube-api-access-7lflt\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.202155 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-catalog-content\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.202175 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-utilities\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.303782 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lflt\" (UniqueName: \"kubernetes.io/projected/cb4f2893-0708-4a59-b523-d4847a0d2a3d-kube-api-access-7lflt\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " 
pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.303885 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-catalog-content\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.303904 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-utilities\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.304305 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-utilities\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.304505 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-catalog-content\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.322824 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lflt\" (UniqueName: \"kubernetes.io/projected/cb4f2893-0708-4a59-b523-d4847a0d2a3d-kube-api-access-7lflt\") pod \"redhat-operators-pcgcf\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc 
kubenswrapper[4789]: I1216 08:09:44.414576 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.730489 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pcgcf"] Dec 16 08:09:44 crc kubenswrapper[4789]: I1216 08:09:44.822793 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcgcf" event={"ID":"cb4f2893-0708-4a59-b523-d4847a0d2a3d","Type":"ContainerStarted","Data":"07aafaaa7b973b2faec9068bb80ba0228209f467e05f33de3bad0eb03f1cd10a"} Dec 16 08:09:45 crc kubenswrapper[4789]: I1216 08:09:45.832362 4789 generic.go:334] "Generic (PLEG): container finished" podID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerID="ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1" exitCode=0 Dec 16 08:09:45 crc kubenswrapper[4789]: I1216 08:09:45.832448 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcgcf" event={"ID":"cb4f2893-0708-4a59-b523-d4847a0d2a3d","Type":"ContainerDied","Data":"ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1"} Dec 16 08:09:45 crc kubenswrapper[4789]: I1216 08:09:45.834069 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:09:47 crc kubenswrapper[4789]: I1216 08:09:47.847407 4789 generic.go:334] "Generic (PLEG): container finished" podID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerID="c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d" exitCode=0 Dec 16 08:09:47 crc kubenswrapper[4789]: I1216 08:09:47.847483 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcgcf" event={"ID":"cb4f2893-0708-4a59-b523-d4847a0d2a3d","Type":"ContainerDied","Data":"c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d"} Dec 16 08:09:49 crc 
kubenswrapper[4789]: I1216 08:09:49.863900 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcgcf" event={"ID":"cb4f2893-0708-4a59-b523-d4847a0d2a3d","Type":"ContainerStarted","Data":"bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917"} Dec 16 08:09:49 crc kubenswrapper[4789]: I1216 08:09:49.884662 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pcgcf" podStartSLOduration=3.005884735 podStartE2EDuration="5.884641598s" podCreationTimestamp="2025-12-16 08:09:44 +0000 UTC" firstStartedPulling="2025-12-16 08:09:45.833802855 +0000 UTC m=+4724.095690484" lastFinishedPulling="2025-12-16 08:09:48.712559718 +0000 UTC m=+4726.974447347" observedRunningTime="2025-12-16 08:09:49.87774847 +0000 UTC m=+4728.139636099" watchObservedRunningTime="2025-12-16 08:09:49.884641598 +0000 UTC m=+4728.146529227" Dec 16 08:09:53 crc kubenswrapper[4789]: I1216 08:09:53.105949 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:09:53 crc kubenswrapper[4789]: E1216 08:09:53.106510 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:09:54 crc kubenswrapper[4789]: I1216 08:09:54.414992 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:54 crc kubenswrapper[4789]: I1216 08:09:54.415069 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:54 crc 
kubenswrapper[4789]: I1216 08:09:54.476853 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:54 crc kubenswrapper[4789]: I1216 08:09:54.934936 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:54 crc kubenswrapper[4789]: I1216 08:09:54.992070 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pcgcf"] Dec 16 08:09:56 crc kubenswrapper[4789]: I1216 08:09:56.914513 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pcgcf" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerName="registry-server" containerID="cri-o://bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917" gracePeriod=2 Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.593639 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.618481 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-catalog-content\") pod \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.618595 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-utilities\") pod \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.618745 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lflt\" (UniqueName: \"kubernetes.io/projected/cb4f2893-0708-4a59-b523-d4847a0d2a3d-kube-api-access-7lflt\") pod \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\" (UID: \"cb4f2893-0708-4a59-b523-d4847a0d2a3d\") " Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.619507 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-utilities" (OuterVolumeSpecName: "utilities") pod "cb4f2893-0708-4a59-b523-d4847a0d2a3d" (UID: "cb4f2893-0708-4a59-b523-d4847a0d2a3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.636727 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4f2893-0708-4a59-b523-d4847a0d2a3d-kube-api-access-7lflt" (OuterVolumeSpecName: "kube-api-access-7lflt") pod "cb4f2893-0708-4a59-b523-d4847a0d2a3d" (UID: "cb4f2893-0708-4a59-b523-d4847a0d2a3d"). InnerVolumeSpecName "kube-api-access-7lflt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.722184 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.722220 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lflt\" (UniqueName: \"kubernetes.io/projected/cb4f2893-0708-4a59-b523-d4847a0d2a3d-kube-api-access-7lflt\") on node \"crc\" DevicePath \"\"" Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.932258 4789 generic.go:334] "Generic (PLEG): container finished" podID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerID="bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917" exitCode=0 Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.932307 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcgcf" event={"ID":"cb4f2893-0708-4a59-b523-d4847a0d2a3d","Type":"ContainerDied","Data":"bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917"} Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.932342 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pcgcf" event={"ID":"cb4f2893-0708-4a59-b523-d4847a0d2a3d","Type":"ContainerDied","Data":"07aafaaa7b973b2faec9068bb80ba0228209f467e05f33de3bad0eb03f1cd10a"} Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.932354 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pcgcf" Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.932365 4789 scope.go:117] "RemoveContainer" containerID="bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917" Dec 16 08:09:58 crc kubenswrapper[4789]: I1216 08:09:58.952300 4789 scope.go:117] "RemoveContainer" containerID="c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.172174 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb4f2893-0708-4a59-b523-d4847a0d2a3d" (UID: "cb4f2893-0708-4a59-b523-d4847a0d2a3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.179096 4789 scope.go:117] "RemoveContainer" containerID="ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.203016 4789 scope.go:117] "RemoveContainer" containerID="bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917" Dec 16 08:09:59 crc kubenswrapper[4789]: E1216 08:09:59.203551 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917\": container with ID starting with bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917 not found: ID does not exist" containerID="bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.203602 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917"} err="failed to get container status 
\"bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917\": rpc error: code = NotFound desc = could not find container \"bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917\": container with ID starting with bcd89f9751a15a639fe23985c0fa8981e14804c9eb007c3e16218e107f7a0917 not found: ID does not exist" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.203629 4789 scope.go:117] "RemoveContainer" containerID="c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d" Dec 16 08:09:59 crc kubenswrapper[4789]: E1216 08:09:59.203965 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d\": container with ID starting with c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d not found: ID does not exist" containerID="c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.203997 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d"} err="failed to get container status \"c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d\": rpc error: code = NotFound desc = could not find container \"c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d\": container with ID starting with c3f314e9bd4103e666618734c92fe062af459dffaa5f7d63f280e6abcae9c25d not found: ID does not exist" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.204018 4789 scope.go:117] "RemoveContainer" containerID="ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1" Dec 16 08:09:59 crc kubenswrapper[4789]: E1216 08:09:59.204326 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1\": container with ID starting with ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1 not found: ID does not exist" containerID="ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.204351 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1"} err="failed to get container status \"ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1\": rpc error: code = NotFound desc = could not find container \"ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1\": container with ID starting with ec10c434fe2cbb7f41655e56b988aa1ddca6f1fd23c192233edefdb1a6cbb7b1 not found: ID does not exist" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.230628 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb4f2893-0708-4a59-b523-d4847a0d2a3d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.259838 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pcgcf"] Dec 16 08:09:59 crc kubenswrapper[4789]: I1216 08:09:59.265785 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pcgcf"] Dec 16 08:10:00 crc kubenswrapper[4789]: I1216 08:10:00.116045 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" path="/var/lib/kubelet/pods/cb4f2893-0708-4a59-b523-d4847a0d2a3d/volumes" Dec 16 08:10:07 crc kubenswrapper[4789]: I1216 08:10:07.104726 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:10:07 crc kubenswrapper[4789]: E1216 08:10:07.105554 4789 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:10:18 crc kubenswrapper[4789]: I1216 08:10:18.105643 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:10:18 crc kubenswrapper[4789]: E1216 08:10:18.106949 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.808385 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 08:10:20 crc kubenswrapper[4789]: E1216 08:10:20.809355 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerName="registry-server" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.809446 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerName="registry-server" Dec 16 08:10:20 crc kubenswrapper[4789]: E1216 08:10:20.809608 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerName="extract-content" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.809678 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" 
containerName="extract-content" Dec 16 08:10:20 crc kubenswrapper[4789]: E1216 08:10:20.809765 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerName="extract-utilities" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.809831 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerName="extract-utilities" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.810081 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4f2893-0708-4a59-b523-d4847a0d2a3d" containerName="registry-server" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.810701 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.813630 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lsf8v" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.817216 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.868650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rlsh\" (UniqueName: \"kubernetes.io/projected/cbc80106-cc75-47d5-beaf-6d9c7c20d41e-kube-api-access-9rlsh\") pod \"mariadb-copy-data\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " pod="openstack/mariadb-copy-data" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.868700 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\") pod \"mariadb-copy-data\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " pod="openstack/mariadb-copy-data" Dec 16 08:10:20 crc kubenswrapper[4789]: 
I1216 08:10:20.970245 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rlsh\" (UniqueName: \"kubernetes.io/projected/cbc80106-cc75-47d5-beaf-6d9c7c20d41e-kube-api-access-9rlsh\") pod \"mariadb-copy-data\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " pod="openstack/mariadb-copy-data" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.970310 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\") pod \"mariadb-copy-data\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " pod="openstack/mariadb-copy-data" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.974106 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.974186 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\") pod \"mariadb-copy-data\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/60a7b0b73b966ea887eac91a5e02ee3ab829f360dab9728d2f1eeaee221ba012/globalmount\"" pod="openstack/mariadb-copy-data" Dec 16 08:10:20 crc kubenswrapper[4789]: I1216 08:10:20.997374 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rlsh\" (UniqueName: \"kubernetes.io/projected/cbc80106-cc75-47d5-beaf-6d9c7c20d41e-kube-api-access-9rlsh\") pod \"mariadb-copy-data\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " pod="openstack/mariadb-copy-data" Dec 16 08:10:21 crc kubenswrapper[4789]: I1216 08:10:21.007672 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\") pod \"mariadb-copy-data\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " pod="openstack/mariadb-copy-data" Dec 16 08:10:21 crc kubenswrapper[4789]: I1216 08:10:21.135861 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 08:10:21 crc kubenswrapper[4789]: I1216 08:10:21.636901 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 08:10:21 crc kubenswrapper[4789]: W1216 08:10:21.642806 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbc80106_cc75_47d5_beaf_6d9c7c20d41e.slice/crio-27e65e5a5c6929d7ba8f9391af75d2c494e2a6aed43189f741611ecfd601880f WatchSource:0}: Error finding container 27e65e5a5c6929d7ba8f9391af75d2c494e2a6aed43189f741611ecfd601880f: Status 404 returned error can't find the container with id 27e65e5a5c6929d7ba8f9391af75d2c494e2a6aed43189f741611ecfd601880f Dec 16 08:10:22 crc kubenswrapper[4789]: I1216 08:10:22.088285 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cbc80106-cc75-47d5-beaf-6d9c7c20d41e","Type":"ContainerStarted","Data":"a68068a499e399865abb8dafc2e1c76db2509b91ee4934d47c506e10c5b07e01"} Dec 16 08:10:22 crc kubenswrapper[4789]: I1216 08:10:22.088327 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cbc80106-cc75-47d5-beaf-6d9c7c20d41e","Type":"ContainerStarted","Data":"27e65e5a5c6929d7ba8f9391af75d2c494e2a6aed43189f741611ecfd601880f"} Dec 16 08:10:22 crc kubenswrapper[4789]: I1216 08:10:22.103864 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.103846968 
podStartE2EDuration="3.103846968s" podCreationTimestamp="2025-12-16 08:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:10:22.101524161 +0000 UTC m=+4760.363411800" watchObservedRunningTime="2025-12-16 08:10:22.103846968 +0000 UTC m=+4760.365734597" Dec 16 08:10:24 crc kubenswrapper[4789]: I1216 08:10:24.978198 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:24 crc kubenswrapper[4789]: I1216 08:10:24.980538 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:24 crc kubenswrapper[4789]: I1216 08:10:24.989351 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:25 crc kubenswrapper[4789]: I1216 08:10:25.024186 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtjh\" (UniqueName: \"kubernetes.io/projected/99e81370-ec53-42cc-a343-4ef71a0d5d8e-kube-api-access-xxtjh\") pod \"mariadb-client\" (UID: \"99e81370-ec53-42cc-a343-4ef71a0d5d8e\") " pod="openstack/mariadb-client" Dec 16 08:10:25 crc kubenswrapper[4789]: I1216 08:10:25.125992 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxtjh\" (UniqueName: \"kubernetes.io/projected/99e81370-ec53-42cc-a343-4ef71a0d5d8e-kube-api-access-xxtjh\") pod \"mariadb-client\" (UID: \"99e81370-ec53-42cc-a343-4ef71a0d5d8e\") " pod="openstack/mariadb-client" Dec 16 08:10:25 crc kubenswrapper[4789]: I1216 08:10:25.158524 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxtjh\" (UniqueName: \"kubernetes.io/projected/99e81370-ec53-42cc-a343-4ef71a0d5d8e-kube-api-access-xxtjh\") pod \"mariadb-client\" (UID: \"99e81370-ec53-42cc-a343-4ef71a0d5d8e\") " pod="openstack/mariadb-client" Dec 16 08:10:25 crc 
kubenswrapper[4789]: I1216 08:10:25.312832 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:25 crc kubenswrapper[4789]: I1216 08:10:25.717480 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:25 crc kubenswrapper[4789]: W1216 08:10:25.720799 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99e81370_ec53_42cc_a343_4ef71a0d5d8e.slice/crio-5888c2aec0960c2114bfa8491a7786d03443ff35528a63344cdee040581c26a3 WatchSource:0}: Error finding container 5888c2aec0960c2114bfa8491a7786d03443ff35528a63344cdee040581c26a3: Status 404 returned error can't find the container with id 5888c2aec0960c2114bfa8491a7786d03443ff35528a63344cdee040581c26a3 Dec 16 08:10:26 crc kubenswrapper[4789]: I1216 08:10:26.126638 4789 generic.go:334] "Generic (PLEG): container finished" podID="99e81370-ec53-42cc-a343-4ef71a0d5d8e" containerID="efa667bbfd1521b1f3810a44ffddada97568787a51d6e16ea1e7b5b4ad09ccc7" exitCode=0 Dec 16 08:10:26 crc kubenswrapper[4789]: I1216 08:10:26.126678 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"99e81370-ec53-42cc-a343-4ef71a0d5d8e","Type":"ContainerDied","Data":"efa667bbfd1521b1f3810a44ffddada97568787a51d6e16ea1e7b5b4ad09ccc7"} Dec 16 08:10:26 crc kubenswrapper[4789]: I1216 08:10:26.126716 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"99e81370-ec53-42cc-a343-4ef71a0d5d8e","Type":"ContainerStarted","Data":"5888c2aec0960c2114bfa8491a7786d03443ff35528a63344cdee040581c26a3"} Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.509626 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.528047 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_99e81370-ec53-42cc-a343-4ef71a0d5d8e/mariadb-client/0.log" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.553011 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.557605 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.565724 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxtjh\" (UniqueName: \"kubernetes.io/projected/99e81370-ec53-42cc-a343-4ef71a0d5d8e-kube-api-access-xxtjh\") pod \"99e81370-ec53-42cc-a343-4ef71a0d5d8e\" (UID: \"99e81370-ec53-42cc-a343-4ef71a0d5d8e\") " Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.570953 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e81370-ec53-42cc-a343-4ef71a0d5d8e-kube-api-access-xxtjh" (OuterVolumeSpecName: "kube-api-access-xxtjh") pod "99e81370-ec53-42cc-a343-4ef71a0d5d8e" (UID: "99e81370-ec53-42cc-a343-4ef71a0d5d8e"). InnerVolumeSpecName "kube-api-access-xxtjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.666901 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxtjh\" (UniqueName: \"kubernetes.io/projected/99e81370-ec53-42cc-a343-4ef71a0d5d8e-kube-api-access-xxtjh\") on node \"crc\" DevicePath \"\"" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.707436 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:27 crc kubenswrapper[4789]: E1216 08:10:27.707782 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e81370-ec53-42cc-a343-4ef71a0d5d8e" containerName="mariadb-client" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.707812 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e81370-ec53-42cc-a343-4ef71a0d5d8e" containerName="mariadb-client" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.708019 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e81370-ec53-42cc-a343-4ef71a0d5d8e" containerName="mariadb-client" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.708609 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.713990 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.768219 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lshbl\" (UniqueName: \"kubernetes.io/projected/f230f626-f0b6-4b6a-85be-5e143b7e5d82-kube-api-access-lshbl\") pod \"mariadb-client\" (UID: \"f230f626-f0b6-4b6a-85be-5e143b7e5d82\") " pod="openstack/mariadb-client" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.869664 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lshbl\" (UniqueName: \"kubernetes.io/projected/f230f626-f0b6-4b6a-85be-5e143b7e5d82-kube-api-access-lshbl\") pod \"mariadb-client\" (UID: \"f230f626-f0b6-4b6a-85be-5e143b7e5d82\") " pod="openstack/mariadb-client" Dec 16 08:10:27 crc kubenswrapper[4789]: I1216 08:10:27.887381 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lshbl\" (UniqueName: \"kubernetes.io/projected/f230f626-f0b6-4b6a-85be-5e143b7e5d82-kube-api-access-lshbl\") pod \"mariadb-client\" (UID: \"f230f626-f0b6-4b6a-85be-5e143b7e5d82\") " pod="openstack/mariadb-client" Dec 16 08:10:28 crc kubenswrapper[4789]: I1216 08:10:28.038997 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:28 crc kubenswrapper[4789]: I1216 08:10:28.118679 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e81370-ec53-42cc-a343-4ef71a0d5d8e" path="/var/lib/kubelet/pods/99e81370-ec53-42cc-a343-4ef71a0d5d8e/volumes" Dec 16 08:10:28 crc kubenswrapper[4789]: I1216 08:10:28.145101 4789 scope.go:117] "RemoveContainer" containerID="efa667bbfd1521b1f3810a44ffddada97568787a51d6e16ea1e7b5b4ad09ccc7" Dec 16 08:10:28 crc kubenswrapper[4789]: I1216 08:10:28.145157 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:28 crc kubenswrapper[4789]: W1216 08:10:28.456146 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf230f626_f0b6_4b6a_85be_5e143b7e5d82.slice/crio-932fdafd0bcd29fcd9ba92d69cada5ceba0f5296744b1a4ef30cbd1cb225b73c WatchSource:0}: Error finding container 932fdafd0bcd29fcd9ba92d69cada5ceba0f5296744b1a4ef30cbd1cb225b73c: Status 404 returned error can't find the container with id 932fdafd0bcd29fcd9ba92d69cada5ceba0f5296744b1a4ef30cbd1cb225b73c Dec 16 08:10:28 crc kubenswrapper[4789]: I1216 08:10:28.459451 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:29 crc kubenswrapper[4789]: I1216 08:10:29.160258 4789 generic.go:334] "Generic (PLEG): container finished" podID="f230f626-f0b6-4b6a-85be-5e143b7e5d82" containerID="81a5aadff1a0aa2385fad4d2d2f4e2ac647c997814200aaa34bd24ec9095f2ad" exitCode=0 Dec 16 08:10:29 crc kubenswrapper[4789]: I1216 08:10:29.161449 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f230f626-f0b6-4b6a-85be-5e143b7e5d82","Type":"ContainerDied","Data":"81a5aadff1a0aa2385fad4d2d2f4e2ac647c997814200aaa34bd24ec9095f2ad"} Dec 16 08:10:29 crc kubenswrapper[4789]: I1216 08:10:29.161887 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"f230f626-f0b6-4b6a-85be-5e143b7e5d82","Type":"ContainerStarted","Data":"932fdafd0bcd29fcd9ba92d69cada5ceba0f5296744b1a4ef30cbd1cb225b73c"} Dec 16 08:10:30 crc kubenswrapper[4789]: I1216 08:10:30.439662 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:30 crc kubenswrapper[4789]: I1216 08:10:30.457832 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_f230f626-f0b6-4b6a-85be-5e143b7e5d82/mariadb-client/0.log" Dec 16 08:10:30 crc kubenswrapper[4789]: I1216 08:10:30.508385 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lshbl\" (UniqueName: \"kubernetes.io/projected/f230f626-f0b6-4b6a-85be-5e143b7e5d82-kube-api-access-lshbl\") pod \"f230f626-f0b6-4b6a-85be-5e143b7e5d82\" (UID: \"f230f626-f0b6-4b6a-85be-5e143b7e5d82\") " Dec 16 08:10:30 crc kubenswrapper[4789]: I1216 08:10:30.514821 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f230f626-f0b6-4b6a-85be-5e143b7e5d82-kube-api-access-lshbl" (OuterVolumeSpecName: "kube-api-access-lshbl") pod "f230f626-f0b6-4b6a-85be-5e143b7e5d82" (UID: "f230f626-f0b6-4b6a-85be-5e143b7e5d82"). InnerVolumeSpecName "kube-api-access-lshbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:10:30 crc kubenswrapper[4789]: I1216 08:10:30.518690 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:30 crc kubenswrapper[4789]: I1216 08:10:30.524087 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 16 08:10:30 crc kubenswrapper[4789]: I1216 08:10:30.610644 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lshbl\" (UniqueName: \"kubernetes.io/projected/f230f626-f0b6-4b6a-85be-5e143b7e5d82-kube-api-access-lshbl\") on node \"crc\" DevicePath \"\"" Dec 16 08:10:31 crc kubenswrapper[4789]: I1216 08:10:31.104714 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:10:31 crc kubenswrapper[4789]: E1216 08:10:31.105016 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:10:31 crc kubenswrapper[4789]: I1216 08:10:31.176234 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="932fdafd0bcd29fcd9ba92d69cada5ceba0f5296744b1a4ef30cbd1cb225b73c" Dec 16 08:10:31 crc kubenswrapper[4789]: I1216 08:10:31.176312 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 16 08:10:32 crc kubenswrapper[4789]: I1216 08:10:32.113791 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f230f626-f0b6-4b6a-85be-5e143b7e5d82" path="/var/lib/kubelet/pods/f230f626-f0b6-4b6a-85be-5e143b7e5d82/volumes" Dec 16 08:10:46 crc kubenswrapper[4789]: I1216 08:10:46.105278 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:10:46 crc kubenswrapper[4789]: E1216 08:10:46.106213 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:10:57 crc kubenswrapper[4789]: I1216 08:10:57.034015 4789 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ppm2p container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 08:10:57 crc kubenswrapper[4789]: I1216 08:10:57.036259 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" podUID="90244cab-89b7-4109-b673-a7cd881ae0a4" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:10:57 crc kubenswrapper[4789]: I1216 08:10:57.044204 4789 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ppm2p container/openshift-config-operator 
namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 08:10:57 crc kubenswrapper[4789]: I1216 08:10:57.044272 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ppm2p" podUID="90244cab-89b7-4109-b673-a7cd881ae0a4" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.104567 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:10:59 crc kubenswrapper[4789]: E1216 08:10:59.105313 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.398229 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 08:10:59 crc kubenswrapper[4789]: E1216 08:10:59.398537 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f230f626-f0b6-4b6a-85be-5e143b7e5d82" containerName="mariadb-client" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.398548 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f230f626-f0b6-4b6a-85be-5e143b7e5d82" containerName="mariadb-client" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.398687 4789 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f230f626-f0b6-4b6a-85be-5e143b7e5d82" containerName="mariadb-client" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.399418 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.403831 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.404155 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nd9fv" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.404274 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.414976 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.425466 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.427496 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.438766 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.440068 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.453896 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.464720 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597078 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e439248-3fed-4550-9d89-8fec7e155a09-config\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597159 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8fl\" (UniqueName: \"kubernetes.io/projected/a51d9137-cd54-4a8d-8217-cebbf247f188-kube-api-access-cq8fl\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597214 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597257 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ce8208-1afd-4f72-bda9-cdb9017e3e51-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597299 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rql8x\" (UniqueName: \"kubernetes.io/projected/0e439248-3fed-4550-9d89-8fec7e155a09-kube-api-access-rql8x\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597328 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a51d9137-cd54-4a8d-8217-cebbf247f188-config\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597350 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e439248-3fed-4550-9d89-8fec7e155a09-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597374 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ce8208-1afd-4f72-bda9-cdb9017e3e51-config\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597398 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597421 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a51d9137-cd54-4a8d-8217-cebbf247f188-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597444 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a51d9137-cd54-4a8d-8217-cebbf247f188-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597472 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce8208-1afd-4f72-bda9-cdb9017e3e51-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597502 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e439248-3fed-4550-9d89-8fec7e155a09-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597527 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a51d9137-cd54-4a8d-8217-cebbf247f188-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597559 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkg8g\" 
(UniqueName: \"kubernetes.io/projected/33ce8208-1afd-4f72-bda9-cdb9017e3e51-kube-api-access-vkg8g\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597607 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e439248-3fed-4550-9d89-8fec7e155a09-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597631 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5868f05c-e195-436b-b0a8-d72393b48b53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5868f05c-e195-436b-b0a8-d72393b48b53\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.597661 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33ce8208-1afd-4f72-bda9-cdb9017e3e51-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.612645 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.614171 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.624620 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.625302 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dqjts" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.626740 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.629378 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.630119 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.641856 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.644175 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.645488 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.653161 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.660404 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.698685 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e439248-3fed-4550-9d89-8fec7e155a09-config\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.698755 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8fl\" (UniqueName: \"kubernetes.io/projected/a51d9137-cd54-4a8d-8217-cebbf247f188-kube-api-access-cq8fl\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.698788 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.698821 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ce8208-1afd-4f72-bda9-cdb9017e3e51-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.698859 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rql8x\" (UniqueName: \"kubernetes.io/projected/0e439248-3fed-4550-9d89-8fec7e155a09-kube-api-access-rql8x\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699177 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a51d9137-cd54-4a8d-8217-cebbf247f188-config\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699213 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ce8208-1afd-4f72-bda9-cdb9017e3e51-config\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699239 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e439248-3fed-4550-9d89-8fec7e155a09-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699265 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699295 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a51d9137-cd54-4a8d-8217-cebbf247f188-scripts\") pod \"ovsdbserver-nb-2\" (UID: 
\"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699317 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a51d9137-cd54-4a8d-8217-cebbf247f188-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699342 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce8208-1afd-4f72-bda9-cdb9017e3e51-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699374 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e439248-3fed-4550-9d89-8fec7e155a09-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699402 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a51d9137-cd54-4a8d-8217-cebbf247f188-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699437 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkg8g\" (UniqueName: \"kubernetes.io/projected/33ce8208-1afd-4f72-bda9-cdb9017e3e51-kube-api-access-vkg8g\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699488 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e439248-3fed-4550-9d89-8fec7e155a09-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699519 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5868f05c-e195-436b-b0a8-d72393b48b53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5868f05c-e195-436b-b0a8-d72393b48b53\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699551 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33ce8208-1afd-4f72-bda9-cdb9017e3e51-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.699688 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e439248-3fed-4550-9d89-8fec7e155a09-config\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.700066 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33ce8208-1afd-4f72-bda9-cdb9017e3e51-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.701402 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ce8208-1afd-4f72-bda9-cdb9017e3e51-scripts\") pod 
\"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.702416 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e439248-3fed-4550-9d89-8fec7e155a09-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.702681 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0e439248-3fed-4550-9d89-8fec7e155a09-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.703093 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a51d9137-cd54-4a8d-8217-cebbf247f188-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.704734 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a51d9137-cd54-4a8d-8217-cebbf247f188-config\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.705233 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a51d9137-cd54-4a8d-8217-cebbf247f188-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.705395 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/33ce8208-1afd-4f72-bda9-cdb9017e3e51-config\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.706204 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.706217 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.706233 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.706242 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/98f7ffe1621181cf7eeeaca0d8cecc217d56cfae6fbc42e3f1d4ec981bc7c9d0/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.706273 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/03efcd70eab7f7acb7a479e24e4e52e24ccb906aedc8bc6c97e958193ede0736/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.706231 4789 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5868f05c-e195-436b-b0a8-d72393b48b53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5868f05c-e195-436b-b0a8-d72393b48b53\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ab61de79aad82212a4a3f54f74ba327f7a2a6c474552360f9387a2c8e3f94e16/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.722351 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a51d9137-cd54-4a8d-8217-cebbf247f188-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.725312 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rql8x\" (UniqueName: \"kubernetes.io/projected/0e439248-3fed-4550-9d89-8fec7e155a09-kube-api-access-rql8x\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.725384 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce8208-1afd-4f72-bda9-cdb9017e3e51-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.725753 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8fl\" (UniqueName: \"kubernetes.io/projected/a51d9137-cd54-4a8d-8217-cebbf247f188-kube-api-access-cq8fl\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc 
kubenswrapper[4789]: I1216 08:10:59.725906 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e439248-3fed-4550-9d89-8fec7e155a09-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.726986 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkg8g\" (UniqueName: \"kubernetes.io/projected/33ce8208-1afd-4f72-bda9-cdb9017e3e51-kube-api-access-vkg8g\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.758986 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9523254-4f95-49d3-b2b9-c6f1dbd516f2\") pod \"ovsdbserver-nb-2\" (UID: \"a51d9137-cd54-4a8d-8217-cebbf247f188\") " pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.766385 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b771b429-13b3-4d85-8ffc-dee1d1712484\") pod \"ovsdbserver-nb-0\" (UID: \"33ce8208-1afd-4f72-bda9-cdb9017e3e51\") " pod="openstack/ovsdbserver-nb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.766729 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.768334 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5868f05c-e195-436b-b0a8-d72393b48b53\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5868f05c-e195-436b-b0a8-d72393b48b53\") pod \"ovsdbserver-nb-1\" (UID: \"0e439248-3fed-4550-9d89-8fec7e155a09\") " pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.777645 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801375 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801427 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-config\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801447 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/042af17b-3a9b-43a0-b270-f52582835b5a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801473 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/042af17b-3a9b-43a0-b270-f52582835b5a-config\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801494 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801517 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc8pw\" (UniqueName: \"kubernetes.io/projected/042af17b-3a9b-43a0-b270-f52582835b5a-kube-api-access-xc8pw\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801537 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76507e71-1eee-4984-9c27-631ab3a139f3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801552 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6d6\" (UniqueName: \"kubernetes.io/projected/76507e71-1eee-4984-9c27-631ab3a139f3-kube-api-access-8m6d6\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801568 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/042af17b-3a9b-43a0-b270-f52582835b5a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801591 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801612 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/042af17b-3a9b-43a0-b270-f52582835b5a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801626 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76507e71-1eee-4984-9c27-631ab3a139f3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801642 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801657 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/76507e71-1eee-4984-9c27-631ab3a139f3-config\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801675 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801689 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76507e71-1eee-4984-9c27-631ab3a139f3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801725 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9e3236c3-bc74-429a-8932-31c282cf612e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e3236c3-bc74-429a-8932-31c282cf612e\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.801754 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxzd\" (UniqueName: \"kubernetes.io/projected/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-kube-api-access-vgxzd\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.903367 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxzd\" (UniqueName: 
\"kubernetes.io/projected/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-kube-api-access-vgxzd\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904520 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904547 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-config\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904567 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/042af17b-3a9b-43a0-b270-f52582835b5a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904597 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042af17b-3a9b-43a0-b270-f52582835b5a-config\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904645 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc 
kubenswrapper[4789]: I1216 08:10:59.904665 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc8pw\" (UniqueName: \"kubernetes.io/projected/042af17b-3a9b-43a0-b270-f52582835b5a-kube-api-access-xc8pw\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904685 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76507e71-1eee-4984-9c27-631ab3a139f3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904698 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m6d6\" (UniqueName: \"kubernetes.io/projected/76507e71-1eee-4984-9c27-631ab3a139f3-kube-api-access-8m6d6\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904716 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/042af17b-3a9b-43a0-b270-f52582835b5a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904744 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904767 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/042af17b-3a9b-43a0-b270-f52582835b5a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904783 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76507e71-1eee-4984-9c27-631ab3a139f3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904804 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904820 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76507e71-1eee-4984-9c27-631ab3a139f3-config\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904841 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904856 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76507e71-1eee-4984-9c27-631ab3a139f3-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.904901 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9e3236c3-bc74-429a-8932-31c282cf612e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e3236c3-bc74-429a-8932-31c282cf612e\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.905667 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/042af17b-3a9b-43a0-b270-f52582835b5a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.906619 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.906956 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76507e71-1eee-4984-9c27-631ab3a139f3-config\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.907492 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76507e71-1eee-4984-9c27-631ab3a139f3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.908447 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.908735 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-config\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.908826 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/76507e71-1eee-4984-9c27-631ab3a139f3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.909387 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/042af17b-3a9b-43a0-b270-f52582835b5a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.909701 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.909726 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/eee07fb5075df9710675cee8d942e6c1c5e92cf7489e7cb1776c3bd173a2b72a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.910711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042af17b-3a9b-43a0-b270-f52582835b5a-config\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.913189 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.913465 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.913486 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/35d4400b6c7bc6e6ffccab39738386500aec41cb679fc4972004c7087485bfb9/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.914062 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.914081 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9e3236c3-bc74-429a-8932-31c282cf612e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e3236c3-bc74-429a-8932-31c282cf612e\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1342674c303cb768f9db7008577870978a940f8aebf313253f224c6446d1a3e3/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.922088 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/042af17b-3a9b-43a0-b270-f52582835b5a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.923709 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6d6\" (UniqueName: \"kubernetes.io/projected/76507e71-1eee-4984-9c27-631ab3a139f3-kube-api-access-8m6d6\") pod 
\"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.928089 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc8pw\" (UniqueName: \"kubernetes.io/projected/042af17b-3a9b-43a0-b270-f52582835b5a-kube-api-access-xc8pw\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.928485 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76507e71-1eee-4984-9c27-631ab3a139f3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.945852 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxzd\" (UniqueName: \"kubernetes.io/projected/5eb2a930-2fde-4ead-a1d6-6b319fddafc7-kube-api-access-vgxzd\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.959322 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01e0e092-8d09-49c9-a5e1-0be4c62a715a\") pod \"ovsdbserver-sb-0\" (UID: \"76507e71-1eee-4984-9c27-631ab3a139f3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.959785 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66d4666f-9f8d-4e48-b524-3c67c8c60dd1\") pod \"ovsdbserver-sb-1\" (UID: \"042af17b-3a9b-43a0-b270-f52582835b5a\") " 
pod="openstack/ovsdbserver-sb-1" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.961031 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9e3236c3-bc74-429a-8932-31c282cf612e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e3236c3-bc74-429a-8932-31c282cf612e\") pod \"ovsdbserver-sb-2\" (UID: \"5eb2a930-2fde-4ead-a1d6-6b319fddafc7\") " pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.970406 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 16 08:10:59 crc kubenswrapper[4789]: I1216 08:10:59.995750 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 16 08:11:00 crc kubenswrapper[4789]: I1216 08:11:00.024244 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 08:11:00 crc kubenswrapper[4789]: I1216 08:11:00.244699 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 08:11:00 crc kubenswrapper[4789]: I1216 08:11:00.318698 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 16 08:11:00 crc kubenswrapper[4789]: I1216 08:11:00.414607 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 16 08:11:00 crc kubenswrapper[4789]: W1216 08:11:00.423124 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod042af17b_3a9b_43a0_b270_f52582835b5a.slice/crio-8bb423980ea674f49d518e7aed32b48ac6b3392e97aa76eb41961941d85a2df1 WatchSource:0}: Error finding container 8bb423980ea674f49d518e7aed32b48ac6b3392e97aa76eb41961941d85a2df1: Status 404 returned error can't find the container with id 8bb423980ea674f49d518e7aed32b48ac6b3392e97aa76eb41961941d85a2df1 Dec 16 08:11:00 crc kubenswrapper[4789]: I1216 08:11:00.507552 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 16 08:11:00 crc kubenswrapper[4789]: I1216 08:11:00.596986 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 08:11:00 crc kubenswrapper[4789]: W1216 08:11:00.598343 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ce8208_1afd_4f72_bda9_cdb9017e3e51.slice/crio-b1fe7f4e15372f6614f0e6a8797b4a87e904a4ec3a4eac790859758d308ddf11 WatchSource:0}: Error finding container b1fe7f4e15372f6614f0e6a8797b4a87e904a4ec3a4eac790859758d308ddf11: Status 404 returned error can't find the container with id b1fe7f4e15372f6614f0e6a8797b4a87e904a4ec3a4eac790859758d308ddf11 Dec 16 08:11:00 crc kubenswrapper[4789]: I1216 08:11:00.753737 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 08:11:00 crc kubenswrapper[4789]: W1216 08:11:00.775457 4789 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76507e71_1eee_4984_9c27_631ab3a139f3.slice/crio-9cd0326af74583f9b919b8bbecf1d5769fc6c931762f6b1d64f44ff90991b25a WatchSource:0}: Error finding container 9cd0326af74583f9b919b8bbecf1d5769fc6c931762f6b1d64f44ff90991b25a: Status 404 returned error can't find the container with id 9cd0326af74583f9b919b8bbecf1d5769fc6c931762f6b1d64f44ff90991b25a Dec 16 08:11:01 crc kubenswrapper[4789]: I1216 08:11:01.111304 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"042af17b-3a9b-43a0-b270-f52582835b5a","Type":"ContainerStarted","Data":"8bb423980ea674f49d518e7aed32b48ac6b3392e97aa76eb41961941d85a2df1"} Dec 16 08:11:01 crc kubenswrapper[4789]: I1216 08:11:01.113314 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5eb2a930-2fde-4ead-a1d6-6b319fddafc7","Type":"ContainerStarted","Data":"fbf64566f37cd2e28d82b9467601845367df1039c0d84fedd2bfc386c077a87e"} Dec 16 08:11:01 crc kubenswrapper[4789]: I1216 08:11:01.114863 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"0e439248-3fed-4550-9d89-8fec7e155a09","Type":"ContainerStarted","Data":"4db2d388312f554febc3a97f2eb3a5be72a927f6b3bf3bd73533ca1b94515b36"} Dec 16 08:11:01 crc kubenswrapper[4789]: I1216 08:11:01.116213 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"76507e71-1eee-4984-9c27-631ab3a139f3","Type":"ContainerStarted","Data":"9cd0326af74583f9b919b8bbecf1d5769fc6c931762f6b1d64f44ff90991b25a"} Dec 16 08:11:01 crc kubenswrapper[4789]: I1216 08:11:01.117868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"33ce8208-1afd-4f72-bda9-cdb9017e3e51","Type":"ContainerStarted","Data":"b1fe7f4e15372f6614f0e6a8797b4a87e904a4ec3a4eac790859758d308ddf11"} Dec 16 08:11:01 crc 
kubenswrapper[4789]: I1216 08:11:01.389854 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.155072 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"a51d9137-cd54-4a8d-8217-cebbf247f188","Type":"ContainerStarted","Data":"dcae27730d9620389ebd848fe48c4c119948bb13457548adc3d3668331ee118a"} Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.520369 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvzg"] Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.525521 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.550135 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvzg"] Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.674438 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-catalog-content\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.674543 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgz9g\" (UniqueName: \"kubernetes.io/projected/3df2fd89-0315-4abd-9126-91ee875fbdef-kube-api-access-pgz9g\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.674602 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-utilities\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.776155 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-catalog-content\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.776256 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgz9g\" (UniqueName: \"kubernetes.io/projected/3df2fd89-0315-4abd-9126-91ee875fbdef-kube-api-access-pgz9g\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.776299 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-utilities\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.776788 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-utilities\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.777365 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-catalog-content\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.796372 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgz9g\" (UniqueName: \"kubernetes.io/projected/3df2fd89-0315-4abd-9126-91ee875fbdef-kube-api-access-pgz9g\") pod \"redhat-marketplace-dpvzg\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:02 crc kubenswrapper[4789]: I1216 08:11:02.864972 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:06 crc kubenswrapper[4789]: I1216 08:11:06.184363 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"33ce8208-1afd-4f72-bda9-cdb9017e3e51","Type":"ContainerStarted","Data":"59914264b3052403ed6949edad300091d3f6e6fdd8bc0edf4eb9edb33ae6e783"} Dec 16 08:11:06 crc kubenswrapper[4789]: I1216 08:11:06.190303 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"042af17b-3a9b-43a0-b270-f52582835b5a","Type":"ContainerStarted","Data":"b5f289b089fcb76639317d7487400c0c8956345818992ccf273d1a043cfe1f78"} Dec 16 08:11:06 crc kubenswrapper[4789]: I1216 08:11:06.192291 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"a51d9137-cd54-4a8d-8217-cebbf247f188","Type":"ContainerStarted","Data":"a949aa80143e63f693b4be6ddc826291c5e160f7bdb45065b8444b8ec218a31a"} Dec 16 08:11:06 crc kubenswrapper[4789]: I1216 08:11:06.198215 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"5eb2a930-2fde-4ead-a1d6-6b319fddafc7","Type":"ContainerStarted","Data":"6559dc9f5ca0a7ee3e2b654905a8533dfdb76ce340af788d08b08ae9ff037212"} Dec 16 08:11:06 crc kubenswrapper[4789]: I1216 08:11:06.200586 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"0e439248-3fed-4550-9d89-8fec7e155a09","Type":"ContainerStarted","Data":"af63ed0fadef37a8831ee4efd2486387e0b34338b848d90902e569ec5cd96141"} Dec 16 08:11:06 crc kubenswrapper[4789]: I1216 08:11:06.206316 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"76507e71-1eee-4984-9c27-631ab3a139f3","Type":"ContainerStarted","Data":"55309afbfdae4ae95d4a95bb408988fdfa5745d97bf3b56aac7b53b76cc828a8"} Dec 16 08:11:06 crc kubenswrapper[4789]: I1216 08:11:06.288405 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvzg"] Dec 16 08:11:06 crc kubenswrapper[4789]: W1216 08:11:06.300621 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df2fd89_0315_4abd_9126_91ee875fbdef.slice/crio-0887ad28ed3a36a4f3f74bb3219c37c0e04f7ea0fc530c82a815768bc237932e WatchSource:0}: Error finding container 0887ad28ed3a36a4f3f74bb3219c37c0e04f7ea0fc530c82a815768bc237932e: Status 404 returned error can't find the container with id 0887ad28ed3a36a4f3f74bb3219c37c0e04f7ea0fc530c82a815768bc237932e Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.216899 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"33ce8208-1afd-4f72-bda9-cdb9017e3e51","Type":"ContainerStarted","Data":"1d51adba2bbf886aea3700d5e1a4917fb34174e348581e973fd89b2be0f93470"} Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.223353 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" 
event={"ID":"042af17b-3a9b-43a0-b270-f52582835b5a","Type":"ContainerStarted","Data":"bc542543f0c1acff13214162a28e45fa02c59277cb9240892f663d060fced71d"} Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.226645 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"a51d9137-cd54-4a8d-8217-cebbf247f188","Type":"ContainerStarted","Data":"95476e01de477faaaf1da5f804b163d7e5b0b38339bcd1b34a9fdcd51c1e9923"} Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.229729 4789 generic.go:334] "Generic (PLEG): container finished" podID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerID="93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48" exitCode=0 Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.229788 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvzg" event={"ID":"3df2fd89-0315-4abd-9126-91ee875fbdef","Type":"ContainerDied","Data":"93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48"} Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.229849 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvzg" event={"ID":"3df2fd89-0315-4abd-9126-91ee875fbdef","Type":"ContainerStarted","Data":"0887ad28ed3a36a4f3f74bb3219c37c0e04f7ea0fc530c82a815768bc237932e"} Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.235690 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5eb2a930-2fde-4ead-a1d6-6b319fddafc7","Type":"ContainerStarted","Data":"cfb0af07c48a5440e84863143d7a7d2baa75321b82d40e0c957382723cafe5cb"} Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.243547 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"0e439248-3fed-4550-9d89-8fec7e155a09","Type":"ContainerStarted","Data":"05a57c864191a17b3ae122a1d926d8f858247b696e0c5ec533f6f169abc9bd0d"} Dec 16 08:11:07 crc 
kubenswrapper[4789]: I1216 08:11:07.244817 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.952381604 podStartE2EDuration="9.244793376s" podCreationTimestamp="2025-12-16 08:10:58 +0000 UTC" firstStartedPulling="2025-12-16 08:11:00.600869369 +0000 UTC m=+4798.862756998" lastFinishedPulling="2025-12-16 08:11:05.893281141 +0000 UTC m=+4804.155168770" observedRunningTime="2025-12-16 08:11:07.237906048 +0000 UTC m=+4805.499793737" watchObservedRunningTime="2025-12-16 08:11:07.244793376 +0000 UTC m=+4805.506681015" Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.247178 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"76507e71-1eee-4984-9c27-631ab3a139f3","Type":"ContainerStarted","Data":"216bb27dbb6b32dc356af8a50b59090d241f6c26763fa47c4a9929908ec3bf13"} Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.276912 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.9006311780000003 podStartE2EDuration="9.27689058s" podCreationTimestamp="2025-12-16 08:10:58 +0000 UTC" firstStartedPulling="2025-12-16 08:11:00.510547612 +0000 UTC m=+4798.772435231" lastFinishedPulling="2025-12-16 08:11:05.886807004 +0000 UTC m=+4804.148694633" observedRunningTime="2025-12-16 08:11:07.271472998 +0000 UTC m=+4805.533360677" watchObservedRunningTime="2025-12-16 08:11:07.27689058 +0000 UTC m=+4805.538778209" Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.296880 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.843644278 podStartE2EDuration="9.296855129s" podCreationTimestamp="2025-12-16 08:10:58 +0000 UTC" firstStartedPulling="2025-12-16 08:11:00.425486624 +0000 UTC m=+4798.687374253" lastFinishedPulling="2025-12-16 08:11:05.878697465 +0000 UTC m=+4804.140585104" observedRunningTime="2025-12-16 
08:11:07.290187456 +0000 UTC m=+4805.552075085" watchObservedRunningTime="2025-12-16 08:11:07.296855129 +0000 UTC m=+4805.558742768" Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.329034 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.858837414 podStartE2EDuration="9.329011635s" podCreationTimestamp="2025-12-16 08:10:58 +0000 UTC" firstStartedPulling="2025-12-16 08:11:01.415368442 +0000 UTC m=+4799.677256071" lastFinishedPulling="2025-12-16 08:11:05.885542663 +0000 UTC m=+4804.147430292" observedRunningTime="2025-12-16 08:11:07.323747916 +0000 UTC m=+4805.585635555" watchObservedRunningTime="2025-12-16 08:11:07.329011635 +0000 UTC m=+4805.590899264" Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.346651 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.752703335 podStartE2EDuration="9.346632105s" podCreationTimestamp="2025-12-16 08:10:58 +0000 UTC" firstStartedPulling="2025-12-16 08:11:00.330576724 +0000 UTC m=+4798.592464353" lastFinishedPulling="2025-12-16 08:11:05.924505484 +0000 UTC m=+4804.186393123" observedRunningTime="2025-12-16 08:11:07.3398848 +0000 UTC m=+4805.601772439" watchObservedRunningTime="2025-12-16 08:11:07.346632105 +0000 UTC m=+4805.608519734" Dec 16 08:11:07 crc kubenswrapper[4789]: I1216 08:11:07.369957 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.171101899 podStartE2EDuration="9.369909984s" podCreationTimestamp="2025-12-16 08:10:58 +0000 UTC" firstStartedPulling="2025-12-16 08:11:00.777803233 +0000 UTC m=+4799.039690872" lastFinishedPulling="2025-12-16 08:11:05.976611328 +0000 UTC m=+4804.238498957" observedRunningTime="2025-12-16 08:11:07.360439222 +0000 UTC m=+4805.622326871" watchObservedRunningTime="2025-12-16 08:11:07.369909984 +0000 UTC m=+4805.631797613" Dec 16 08:11:08 crc 
kubenswrapper[4789]: I1216 08:11:08.258376 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvzg" event={"ID":"3df2fd89-0315-4abd-9126-91ee875fbdef","Type":"ContainerStarted","Data":"55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7"} Dec 16 08:11:08 crc kubenswrapper[4789]: I1216 08:11:08.767016 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 16 08:11:08 crc kubenswrapper[4789]: I1216 08:11:08.778103 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 16 08:11:08 crc kubenswrapper[4789]: I1216 08:11:08.971766 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 16 08:11:08 crc kubenswrapper[4789]: I1216 08:11:08.996325 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.025040 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.073989 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.245791 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.268540 4789 generic.go:334] "Generic (PLEG): container finished" podID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerID="55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7" exitCode=0 Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.268657 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvzg" 
event={"ID":"3df2fd89-0315-4abd-9126-91ee875fbdef","Type":"ContainerDied","Data":"55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7"} Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.271326 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.322163 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.767520 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.777758 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.971527 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 16 08:11:09 crc kubenswrapper[4789]: I1216 08:11:09.996109 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 16 08:11:10 crc kubenswrapper[4789]: I1216 08:11:10.245684 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 16 08:11:10 crc kubenswrapper[4789]: I1216 08:11:10.291823 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvzg" event={"ID":"3df2fd89-0315-4abd-9126-91ee875fbdef","Type":"ContainerStarted","Data":"4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e"} Dec 16 08:11:10 crc kubenswrapper[4789]: I1216 08:11:10.318825 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dpvzg" podStartSLOduration=5.694168547 podStartE2EDuration="8.318802221s" podCreationTimestamp="2025-12-16 08:11:02 +0000 UTC" firstStartedPulling="2025-12-16 08:11:07.233408378 +0000 
UTC m=+4805.495296047" lastFinishedPulling="2025-12-16 08:11:09.858042052 +0000 UTC m=+4808.119929721" observedRunningTime="2025-12-16 08:11:10.317875148 +0000 UTC m=+4808.579762797" watchObservedRunningTime="2025-12-16 08:11:10.318802221 +0000 UTC m=+4808.580689870" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.105474 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:11:11 crc kubenswrapper[4789]: E1216 08:11:11.106026 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.340158 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.347044 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.647499 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76c847f849-pc2rc"] Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.652258 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.656758 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.672312 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c847f849-pc2rc"] Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.743292 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-config\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.743357 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-dns-svc\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.743407 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkxv\" (UniqueName: \"kubernetes.io/projected/225fa97e-e206-4031-8b1a-01c24269fd85-kube-api-access-gqkxv\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.743638 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-ovsdbserver-sb\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " 
pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.785796 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c847f849-pc2rc"] Dec 16 08:11:11 crc kubenswrapper[4789]: E1216 08:11:11.786354 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-gqkxv ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-76c847f849-pc2rc" podUID="225fa97e-e206-4031-8b1a-01c24269fd85" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.804728 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b4fffdc5f-4mkt4"] Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.805945 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.809381 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.818941 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4fffdc5f-4mkt4"] Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.833553 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.843387 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.845168 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-ovsdbserver-sb\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc 
kubenswrapper[4789]: I1216 08:11:11.845284 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-config\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.845328 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-dns-svc\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.845349 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkxv\" (UniqueName: \"kubernetes.io/projected/225fa97e-e206-4031-8b1a-01c24269fd85-kube-api-access-gqkxv\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.846364 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-config\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.846373 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-dns-svc\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.846855 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-ovsdbserver-sb\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.883531 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkxv\" (UniqueName: \"kubernetes.io/projected/225fa97e-e206-4031-8b1a-01c24269fd85-kube-api-access-gqkxv\") pod \"dnsmasq-dns-76c847f849-pc2rc\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.894772 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.897956 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.947164 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tc67\" (UniqueName: \"kubernetes.io/projected/f39411a9-1613-48ed-9806-9433b9b22ba5-kube-api-access-8tc67\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.947233 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.947312 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.947345 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-dns-svc\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:11 crc kubenswrapper[4789]: I1216 08:11:11.947384 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-config\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.018813 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.041516 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.048792 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tc67\" (UniqueName: \"kubernetes.io/projected/f39411a9-1613-48ed-9806-9433b9b22ba5-kube-api-access-8tc67\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.048834 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.048877 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.048912 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-dns-svc\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.048966 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-config\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.050117 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-config\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.050841 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.051420 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.052091 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-dns-svc\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.058469 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.089624 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tc67\" (UniqueName: \"kubernetes.io/projected/f39411a9-1613-48ed-9806-9433b9b22ba5-kube-api-access-8tc67\") pod \"dnsmasq-dns-6b4fffdc5f-4mkt4\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.115725 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.137432 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.304831 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.328552 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.457746 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-config\") pod \"225fa97e-e206-4031-8b1a-01c24269fd85\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.458162 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-dns-svc\") pod \"225fa97e-e206-4031-8b1a-01c24269fd85\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.458254 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkxv\" (UniqueName: \"kubernetes.io/projected/225fa97e-e206-4031-8b1a-01c24269fd85-kube-api-access-gqkxv\") pod \"225fa97e-e206-4031-8b1a-01c24269fd85\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.458299 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-ovsdbserver-sb\") pod \"225fa97e-e206-4031-8b1a-01c24269fd85\" (UID: \"225fa97e-e206-4031-8b1a-01c24269fd85\") " Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.458841 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "225fa97e-e206-4031-8b1a-01c24269fd85" (UID: "225fa97e-e206-4031-8b1a-01c24269fd85"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.458926 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-config" (OuterVolumeSpecName: "config") pod "225fa97e-e206-4031-8b1a-01c24269fd85" (UID: "225fa97e-e206-4031-8b1a-01c24269fd85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.460423 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "225fa97e-e206-4031-8b1a-01c24269fd85" (UID: "225fa97e-e206-4031-8b1a-01c24269fd85"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.478314 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225fa97e-e206-4031-8b1a-01c24269fd85-kube-api-access-gqkxv" (OuterVolumeSpecName: "kube-api-access-gqkxv") pod "225fa97e-e206-4031-8b1a-01c24269fd85" (UID: "225fa97e-e206-4031-8b1a-01c24269fd85"). InnerVolumeSpecName "kube-api-access-gqkxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.560619 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.560671 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.560683 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqkxv\" (UniqueName: \"kubernetes.io/projected/225fa97e-e206-4031-8b1a-01c24269fd85-kube-api-access-gqkxv\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.560695 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/225fa97e-e206-4031-8b1a-01c24269fd85-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.688551 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4fffdc5f-4mkt4"] Dec 16 08:11:12 crc kubenswrapper[4789]: W1216 08:11:12.690167 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf39411a9_1613_48ed_9806_9433b9b22ba5.slice/crio-a684fa63c0e96e73e8a115c73df789c16a3b1c6fbc73fe416ea7d5d55a408132 WatchSource:0}: Error finding container a684fa63c0e96e73e8a115c73df789c16a3b1c6fbc73fe416ea7d5d55a408132: Status 404 returned error can't find the container with id a684fa63c0e96e73e8a115c73df789c16a3b1c6fbc73fe416ea7d5d55a408132 Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.865284 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.866155 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:12 crc kubenswrapper[4789]: I1216 08:11:12.908678 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:13 crc kubenswrapper[4789]: I1216 08:11:13.321304 4789 generic.go:334] "Generic (PLEG): container finished" podID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerID="bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e" exitCode=0 Dec 16 08:11:13 crc kubenswrapper[4789]: I1216 08:11:13.321350 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" event={"ID":"f39411a9-1613-48ed-9806-9433b9b22ba5","Type":"ContainerDied","Data":"bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e"} Dec 16 08:11:13 crc kubenswrapper[4789]: I1216 08:11:13.321397 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" event={"ID":"f39411a9-1613-48ed-9806-9433b9b22ba5","Type":"ContainerStarted","Data":"a684fa63c0e96e73e8a115c73df789c16a3b1c6fbc73fe416ea7d5d55a408132"} Dec 16 08:11:13 crc kubenswrapper[4789]: I1216 08:11:13.321433 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c847f849-pc2rc" Dec 16 08:11:13 crc kubenswrapper[4789]: I1216 08:11:13.469758 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c847f849-pc2rc"] Dec 16 08:11:13 crc kubenswrapper[4789]: I1216 08:11:13.477665 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76c847f849-pc2rc"] Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.114362 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225fa97e-e206-4031-8b1a-01c24269fd85" path="/var/lib/kubelet/pods/225fa97e-e206-4031-8b1a-01c24269fd85/volumes" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.330701 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" event={"ID":"f39411a9-1613-48ed-9806-9433b9b22ba5","Type":"ContainerStarted","Data":"87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6"} Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.352582 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" podStartSLOduration=3.352564888 podStartE2EDuration="3.352564888s" podCreationTimestamp="2025-12-16 08:11:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:11:14.345443494 +0000 UTC m=+4812.607331123" watchObservedRunningTime="2025-12-16 08:11:14.352564888 +0000 UTC m=+4812.614452517" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.762518 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.764041 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.768694 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.797172 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.896034 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4ca28de1-5df8-477e-9068-55523ad13390-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.896088 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7jwj\" (UniqueName: \"kubernetes.io/projected/4ca28de1-5df8-477e-9068-55523ad13390-kube-api-access-b7jwj\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.896273 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.997383 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 
08:11:14.997445 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4ca28de1-5df8-477e-9068-55523ad13390-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:14 crc kubenswrapper[4789]: I1216 08:11:14.997467 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7jwj\" (UniqueName: \"kubernetes.io/projected/4ca28de1-5df8-477e-9068-55523ad13390-kube-api-access-b7jwj\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.000437 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.000473 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f849ef9c979fa9ee6088b4daa792413d4a67967d1b8704eb8ec39c6f3d1c6ede/globalmount\"" pod="openstack/ovn-copy-data" Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.011308 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4ca28de1-5df8-477e-9068-55523ad13390-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.015884 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7jwj\" (UniqueName: 
\"kubernetes.io/projected/4ca28de1-5df8-477e-9068-55523ad13390-kube-api-access-b7jwj\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.029634 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\") pod \"ovn-copy-data\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " pod="openstack/ovn-copy-data" Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.101138 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.338141 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:15 crc kubenswrapper[4789]: I1216 08:11:15.602734 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 08:11:15 crc kubenswrapper[4789]: W1216 08:11:15.605180 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ca28de1_5df8_477e_9068_55523ad13390.slice/crio-9ed0c063f413002d2196de194566ac6589a547263ea761adf484a8f865a3f4a8 WatchSource:0}: Error finding container 9ed0c063f413002d2196de194566ac6589a547263ea761adf484a8f865a3f4a8: Status 404 returned error can't find the container with id 9ed0c063f413002d2196de194566ac6589a547263ea761adf484a8f865a3f4a8 Dec 16 08:11:16 crc kubenswrapper[4789]: I1216 08:11:16.348581 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4ca28de1-5df8-477e-9068-55523ad13390","Type":"ContainerStarted","Data":"a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f"} Dec 16 08:11:16 crc kubenswrapper[4789]: I1216 08:11:16.349154 4789 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4ca28de1-5df8-477e-9068-55523ad13390","Type":"ContainerStarted","Data":"9ed0c063f413002d2196de194566ac6589a547263ea761adf484a8f865a3f4a8"} Dec 16 08:11:16 crc kubenswrapper[4789]: I1216 08:11:16.375000 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.192485307 podStartE2EDuration="3.374972896s" podCreationTimestamp="2025-12-16 08:11:13 +0000 UTC" firstStartedPulling="2025-12-16 08:11:15.6077997 +0000 UTC m=+4813.869687329" lastFinishedPulling="2025-12-16 08:11:15.790287289 +0000 UTC m=+4814.052174918" observedRunningTime="2025-12-16 08:11:16.367562645 +0000 UTC m=+4814.629450354" watchObservedRunningTime="2025-12-16 08:11:16.374972896 +0000 UTC m=+4814.636860565" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.527610 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.529726 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.531940 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-b5r8f" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.532631 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.534624 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.536277 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.612522 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-config\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.612603 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.612631 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-scripts\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.612711 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.612739 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrlt2\" (UniqueName: \"kubernetes.io/projected/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-kube-api-access-lrlt2\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.714308 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.714355 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrlt2\" (UniqueName: \"kubernetes.io/projected/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-kube-api-access-lrlt2\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.714438 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-config\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.714497 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 
16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.714523 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-scripts\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.715231 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.715590 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-scripts\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.715608 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-config\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.721403 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.730172 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrlt2\" (UniqueName: \"kubernetes.io/projected/2e8dd6cb-78cb-41ae-88e0-2a0b3720d598-kube-api-access-lrlt2\") pod \"ovn-northd-0\" (UID: 
\"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598\") " pod="openstack/ovn-northd-0" Dec 16 08:11:21 crc kubenswrapper[4789]: I1216 08:11:21.852531 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.139099 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.207416 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-9gvkk"] Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.207734 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" podUID="0af3676a-e3f5-4a94-a36d-b350e3b22769" containerName="dnsmasq-dns" containerID="cri-o://cec414a1d477528bfbb8b0c519c7e253da6e5b427769252601980687bac66252" gracePeriod=10 Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.307113 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 08:11:22 crc kubenswrapper[4789]: W1216 08:11:22.315220 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e8dd6cb_78cb_41ae_88e0_2a0b3720d598.slice/crio-9fb4896b4278cc20599a99361e70e42dbcf96c13a6fcac4f7ea429a24311af73 WatchSource:0}: Error finding container 9fb4896b4278cc20599a99361e70e42dbcf96c13a6fcac4f7ea429a24311af73: Status 404 returned error can't find the container with id 9fb4896b4278cc20599a99361e70e42dbcf96c13a6fcac4f7ea429a24311af73 Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.436338 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598","Type":"ContainerStarted","Data":"9fb4896b4278cc20599a99361e70e42dbcf96c13a6fcac4f7ea429a24311af73"} Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.443005 
4789 generic.go:334] "Generic (PLEG): container finished" podID="0af3676a-e3f5-4a94-a36d-b350e3b22769" containerID="cec414a1d477528bfbb8b0c519c7e253da6e5b427769252601980687bac66252" exitCode=0 Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.443051 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" event={"ID":"0af3676a-e3f5-4a94-a36d-b350e3b22769","Type":"ContainerDied","Data":"cec414a1d477528bfbb8b0c519c7e253da6e5b427769252601980687bac66252"} Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.620408 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.734099 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-dns-svc\") pod \"0af3676a-e3f5-4a94-a36d-b350e3b22769\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.734923 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-config\") pod \"0af3676a-e3f5-4a94-a36d-b350e3b22769\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.735004 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcm4n\" (UniqueName: \"kubernetes.io/projected/0af3676a-e3f5-4a94-a36d-b350e3b22769-kube-api-access-qcm4n\") pod \"0af3676a-e3f5-4a94-a36d-b350e3b22769\" (UID: \"0af3676a-e3f5-4a94-a36d-b350e3b22769\") " Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.739828 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af3676a-e3f5-4a94-a36d-b350e3b22769-kube-api-access-qcm4n" (OuterVolumeSpecName: 
"kube-api-access-qcm4n") pod "0af3676a-e3f5-4a94-a36d-b350e3b22769" (UID: "0af3676a-e3f5-4a94-a36d-b350e3b22769"). InnerVolumeSpecName "kube-api-access-qcm4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.773252 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-config" (OuterVolumeSpecName: "config") pod "0af3676a-e3f5-4a94-a36d-b350e3b22769" (UID: "0af3676a-e3f5-4a94-a36d-b350e3b22769"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.780604 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0af3676a-e3f5-4a94-a36d-b350e3b22769" (UID: "0af3676a-e3f5-4a94-a36d-b350e3b22769"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.836582 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.836929 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcm4n\" (UniqueName: \"kubernetes.io/projected/0af3676a-e3f5-4a94-a36d-b350e3b22769-kube-api-access-qcm4n\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.836941 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0af3676a-e3f5-4a94-a36d-b350e3b22769-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.917825 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:22 crc kubenswrapper[4789]: I1216 08:11:22.961779 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvzg"] Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.453380 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598","Type":"ContainerStarted","Data":"3987e05e6460914263b7fef89e1e68c566179ca11eed85b7b8c02373afdfff53"} Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.453444 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2e8dd6cb-78cb-41ae-88e0-2a0b3720d598","Type":"ContainerStarted","Data":"0b0903de4de7e240224556713cce0ee8ae51d72de3879d4d26a36e9b33678224"} Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.453602 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.455434 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" event={"ID":"0af3676a-e3f5-4a94-a36d-b350e3b22769","Type":"ContainerDied","Data":"aea66839aa342f93398e60512eb78545242773c1c71edc9bcea92cb08b6c9e27"} Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.455486 4789 scope.go:117] "RemoveContainer" containerID="cec414a1d477528bfbb8b0c519c7e253da6e5b427769252601980687bac66252" Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.455493 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55db7cd99c-9gvkk" Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.455498 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dpvzg" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="registry-server" containerID="cri-o://4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e" gracePeriod=2 Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.478226 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.73227421 podStartE2EDuration="2.478204517s" podCreationTimestamp="2025-12-16 08:11:21 +0000 UTC" firstStartedPulling="2025-12-16 08:11:22.318590651 +0000 UTC m=+4820.580478280" lastFinishedPulling="2025-12-16 08:11:23.064520948 +0000 UTC m=+4821.326408587" observedRunningTime="2025-12-16 08:11:23.471666067 +0000 UTC m=+4821.733553696" watchObservedRunningTime="2025-12-16 08:11:23.478204517 +0000 UTC m=+4821.740092146" Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.491615 4789 scope.go:117] "RemoveContainer" containerID="98c73c84bbd9e6e552dcfa4296254caefc9be43e5775f19646015cc31e5915f0" Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.500991 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-9gvkk"] Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.506516 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55db7cd99c-9gvkk"] Dec 16 08:11:23 crc kubenswrapper[4789]: I1216 08:11:23.945609 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.054540 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-catalog-content\") pod \"3df2fd89-0315-4abd-9126-91ee875fbdef\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.054574 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-utilities\") pod \"3df2fd89-0315-4abd-9126-91ee875fbdef\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.054701 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgz9g\" (UniqueName: \"kubernetes.io/projected/3df2fd89-0315-4abd-9126-91ee875fbdef-kube-api-access-pgz9g\") pod \"3df2fd89-0315-4abd-9126-91ee875fbdef\" (UID: \"3df2fd89-0315-4abd-9126-91ee875fbdef\") " Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.055599 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-utilities" (OuterVolumeSpecName: "utilities") pod "3df2fd89-0315-4abd-9126-91ee875fbdef" (UID: "3df2fd89-0315-4abd-9126-91ee875fbdef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.060738 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df2fd89-0315-4abd-9126-91ee875fbdef-kube-api-access-pgz9g" (OuterVolumeSpecName: "kube-api-access-pgz9g") pod "3df2fd89-0315-4abd-9126-91ee875fbdef" (UID: "3df2fd89-0315-4abd-9126-91ee875fbdef"). InnerVolumeSpecName "kube-api-access-pgz9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.080770 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3df2fd89-0315-4abd-9126-91ee875fbdef" (UID: "3df2fd89-0315-4abd-9126-91ee875fbdef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.127249 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af3676a-e3f5-4a94-a36d-b350e3b22769" path="/var/lib/kubelet/pods/0af3676a-e3f5-4a94-a36d-b350e3b22769/volumes" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.156550 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.156581 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df2fd89-0315-4abd-9126-91ee875fbdef-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.156617 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgz9g\" (UniqueName: \"kubernetes.io/projected/3df2fd89-0315-4abd-9126-91ee875fbdef-kube-api-access-pgz9g\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.474133 4789 generic.go:334] "Generic (PLEG): container finished" podID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerID="4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e" exitCode=0 Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.474229 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvzg" 
event={"ID":"3df2fd89-0315-4abd-9126-91ee875fbdef","Type":"ContainerDied","Data":"4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e"} Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.474280 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpvzg" event={"ID":"3df2fd89-0315-4abd-9126-91ee875fbdef","Type":"ContainerDied","Data":"0887ad28ed3a36a4f3f74bb3219c37c0e04f7ea0fc530c82a815768bc237932e"} Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.474299 4789 scope.go:117] "RemoveContainer" containerID="4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.474369 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpvzg" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.503678 4789 scope.go:117] "RemoveContainer" containerID="55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.506837 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvzg"] Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.515460 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpvzg"] Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.547262 4789 scope.go:117] "RemoveContainer" containerID="93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.563218 4789 scope.go:117] "RemoveContainer" containerID="4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e" Dec 16 08:11:24 crc kubenswrapper[4789]: E1216 08:11:24.563901 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e\": container 
with ID starting with 4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e not found: ID does not exist" containerID="4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.563969 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e"} err="failed to get container status \"4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e\": rpc error: code = NotFound desc = could not find container \"4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e\": container with ID starting with 4b564e85877ef0a1ac59ffe1fa26e6e25a07d4b0557719f2697df1b7fe41115e not found: ID does not exist" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.563993 4789 scope.go:117] "RemoveContainer" containerID="55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7" Dec 16 08:11:24 crc kubenswrapper[4789]: E1216 08:11:24.564425 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7\": container with ID starting with 55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7 not found: ID does not exist" containerID="55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.564458 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7"} err="failed to get container status \"55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7\": rpc error: code = NotFound desc = could not find container \"55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7\": container with ID starting with 55bbf10b580a226e91409be826ef0ad32fb7a352400168592b3feb084f03adb7 not 
found: ID does not exist" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.564481 4789 scope.go:117] "RemoveContainer" containerID="93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48" Dec 16 08:11:24 crc kubenswrapper[4789]: E1216 08:11:24.565085 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48\": container with ID starting with 93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48 not found: ID does not exist" containerID="93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48" Dec 16 08:11:24 crc kubenswrapper[4789]: I1216 08:11:24.565128 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48"} err="failed to get container status \"93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48\": rpc error: code = NotFound desc = could not find container \"93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48\": container with ID starting with 93bfc622951ce0c53098583710fffa27331fe396b82b413bd5e42be3280ada48 not found: ID does not exist" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.105308 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:11:26 crc kubenswrapper[4789]: E1216 08:11:26.106298 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.116155 
4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" path="/var/lib/kubelet/pods/3df2fd89-0315-4abd-9126-91ee875fbdef/volumes" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.375735 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fffh6"] Dec 16 08:11:26 crc kubenswrapper[4789]: E1216 08:11:26.376303 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="extract-content" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.376327 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="extract-content" Dec 16 08:11:26 crc kubenswrapper[4789]: E1216 08:11:26.376344 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="extract-utilities" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.376352 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="extract-utilities" Dec 16 08:11:26 crc kubenswrapper[4789]: E1216 08:11:26.376373 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af3676a-e3f5-4a94-a36d-b350e3b22769" containerName="dnsmasq-dns" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.376381 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af3676a-e3f5-4a94-a36d-b350e3b22769" containerName="dnsmasq-dns" Dec 16 08:11:26 crc kubenswrapper[4789]: E1216 08:11:26.376393 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="registry-server" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.376400 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="registry-server" Dec 16 08:11:26 crc kubenswrapper[4789]: E1216 08:11:26.376421 4789 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af3676a-e3f5-4a94-a36d-b350e3b22769" containerName="init" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.376428 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af3676a-e3f5-4a94-a36d-b350e3b22769" containerName="init" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.376591 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af3676a-e3f5-4a94-a36d-b350e3b22769" containerName="dnsmasq-dns" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.376605 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df2fd89-0315-4abd-9126-91ee875fbdef" containerName="registry-server" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.377337 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.385763 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fffh6"] Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.473821 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-709f-account-create-update-vw46j"] Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.475031 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.478276 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.482859 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-709f-account-create-update-vw46j"] Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.503071 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201ce99d-6f07-4de4-a84e-ce221215a532-operator-scripts\") pod \"keystone-db-create-fffh6\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.503404 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kp7\" (UniqueName: \"kubernetes.io/projected/201ce99d-6f07-4de4-a84e-ce221215a532-kube-api-access-q2kp7\") pod \"keystone-db-create-fffh6\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.605181 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201ce99d-6f07-4de4-a84e-ce221215a532-operator-scripts\") pod \"keystone-db-create-fffh6\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.605298 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aad970bb-e71b-431c-ae77-d85133144832-operator-scripts\") pod \"keystone-709f-account-create-update-vw46j\" (UID: 
\"aad970bb-e71b-431c-ae77-d85133144832\") " pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.605343 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncfbm\" (UniqueName: \"kubernetes.io/projected/aad970bb-e71b-431c-ae77-d85133144832-kube-api-access-ncfbm\") pod \"keystone-709f-account-create-update-vw46j\" (UID: \"aad970bb-e71b-431c-ae77-d85133144832\") " pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.605369 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kp7\" (UniqueName: \"kubernetes.io/projected/201ce99d-6f07-4de4-a84e-ce221215a532-kube-api-access-q2kp7\") pod \"keystone-db-create-fffh6\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.606254 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201ce99d-6f07-4de4-a84e-ce221215a532-operator-scripts\") pod \"keystone-db-create-fffh6\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.628214 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kp7\" (UniqueName: \"kubernetes.io/projected/201ce99d-6f07-4de4-a84e-ce221215a532-kube-api-access-q2kp7\") pod \"keystone-db-create-fffh6\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.694562 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.706576 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncfbm\" (UniqueName: \"kubernetes.io/projected/aad970bb-e71b-431c-ae77-d85133144832-kube-api-access-ncfbm\") pod \"keystone-709f-account-create-update-vw46j\" (UID: \"aad970bb-e71b-431c-ae77-d85133144832\") " pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.706770 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aad970bb-e71b-431c-ae77-d85133144832-operator-scripts\") pod \"keystone-709f-account-create-update-vw46j\" (UID: \"aad970bb-e71b-431c-ae77-d85133144832\") " pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.707421 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aad970bb-e71b-431c-ae77-d85133144832-operator-scripts\") pod \"keystone-709f-account-create-update-vw46j\" (UID: \"aad970bb-e71b-431c-ae77-d85133144832\") " pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.724818 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncfbm\" (UniqueName: \"kubernetes.io/projected/aad970bb-e71b-431c-ae77-d85133144832-kube-api-access-ncfbm\") pod \"keystone-709f-account-create-update-vw46j\" (UID: \"aad970bb-e71b-431c-ae77-d85133144832\") " pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:26 crc kubenswrapper[4789]: I1216 08:11:26.789470 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:27 crc kubenswrapper[4789]: I1216 08:11:27.160760 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fffh6"] Dec 16 08:11:27 crc kubenswrapper[4789]: W1216 08:11:27.169671 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod201ce99d_6f07_4de4_a84e_ce221215a532.slice/crio-1e05769c46987b3552a6de1c3d3cdc25fa482a47f3ea47be5c1cfeed7c6066db WatchSource:0}: Error finding container 1e05769c46987b3552a6de1c3d3cdc25fa482a47f3ea47be5c1cfeed7c6066db: Status 404 returned error can't find the container with id 1e05769c46987b3552a6de1c3d3cdc25fa482a47f3ea47be5c1cfeed7c6066db Dec 16 08:11:27 crc kubenswrapper[4789]: I1216 08:11:27.273598 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-709f-account-create-update-vw46j"] Dec 16 08:11:27 crc kubenswrapper[4789]: W1216 08:11:27.278524 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaad970bb_e71b_431c_ae77_d85133144832.slice/crio-383b67d4c3c4e5e0ce7b65c03d6bc04de4c34a2c07861623a2f8bf452b16e6e7 WatchSource:0}: Error finding container 383b67d4c3c4e5e0ce7b65c03d6bc04de4c34a2c07861623a2f8bf452b16e6e7: Status 404 returned error can't find the container with id 383b67d4c3c4e5e0ce7b65c03d6bc04de4c34a2c07861623a2f8bf452b16e6e7 Dec 16 08:11:27 crc kubenswrapper[4789]: I1216 08:11:27.521301 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-709f-account-create-update-vw46j" event={"ID":"aad970bb-e71b-431c-ae77-d85133144832","Type":"ContainerStarted","Data":"ad6dce7d1055a8a8e4d640acae27762ad6b5495e5b83fa99a838a7bc2eedc9bb"} Dec 16 08:11:27 crc kubenswrapper[4789]: I1216 08:11:27.521358 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-709f-account-create-update-vw46j" event={"ID":"aad970bb-e71b-431c-ae77-d85133144832","Type":"ContainerStarted","Data":"383b67d4c3c4e5e0ce7b65c03d6bc04de4c34a2c07861623a2f8bf452b16e6e7"} Dec 16 08:11:27 crc kubenswrapper[4789]: I1216 08:11:27.524482 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fffh6" event={"ID":"201ce99d-6f07-4de4-a84e-ce221215a532","Type":"ContainerStarted","Data":"7eea37e3283a63fbfa33adb2ac6bcc7bce7091e96b48f61a59c32234fc3413da"} Dec 16 08:11:27 crc kubenswrapper[4789]: I1216 08:11:27.524512 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fffh6" event={"ID":"201ce99d-6f07-4de4-a84e-ce221215a532","Type":"ContainerStarted","Data":"1e05769c46987b3552a6de1c3d3cdc25fa482a47f3ea47be5c1cfeed7c6066db"} Dec 16 08:11:27 crc kubenswrapper[4789]: I1216 08:11:27.536649 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-709f-account-create-update-vw46j" podStartSLOduration=1.536635336 podStartE2EDuration="1.536635336s" podCreationTimestamp="2025-12-16 08:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:11:27.536480602 +0000 UTC m=+4825.798368231" watchObservedRunningTime="2025-12-16 08:11:27.536635336 +0000 UTC m=+4825.798522965" Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.533635 4789 generic.go:334] "Generic (PLEG): container finished" podID="aad970bb-e71b-431c-ae77-d85133144832" containerID="ad6dce7d1055a8a8e4d640acae27762ad6b5495e5b83fa99a838a7bc2eedc9bb" exitCode=0 Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.533760 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-709f-account-create-update-vw46j" event={"ID":"aad970bb-e71b-431c-ae77-d85133144832","Type":"ContainerDied","Data":"ad6dce7d1055a8a8e4d640acae27762ad6b5495e5b83fa99a838a7bc2eedc9bb"} Dec 
16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.537375 4789 generic.go:334] "Generic (PLEG): container finished" podID="201ce99d-6f07-4de4-a84e-ce221215a532" containerID="7eea37e3283a63fbfa33adb2ac6bcc7bce7091e96b48f61a59c32234fc3413da" exitCode=0 Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.537455 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fffh6" event={"ID":"201ce99d-6f07-4de4-a84e-ce221215a532","Type":"ContainerDied","Data":"7eea37e3283a63fbfa33adb2ac6bcc7bce7091e96b48f61a59c32234fc3413da"} Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.839725 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.947277 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201ce99d-6f07-4de4-a84e-ce221215a532-operator-scripts\") pod \"201ce99d-6f07-4de4-a84e-ce221215a532\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.947324 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2kp7\" (UniqueName: \"kubernetes.io/projected/201ce99d-6f07-4de4-a84e-ce221215a532-kube-api-access-q2kp7\") pod \"201ce99d-6f07-4de4-a84e-ce221215a532\" (UID: \"201ce99d-6f07-4de4-a84e-ce221215a532\") " Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.947725 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201ce99d-6f07-4de4-a84e-ce221215a532-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "201ce99d-6f07-4de4-a84e-ce221215a532" (UID: "201ce99d-6f07-4de4-a84e-ce221215a532"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:28 crc kubenswrapper[4789]: I1216 08:11:28.954169 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201ce99d-6f07-4de4-a84e-ce221215a532-kube-api-access-q2kp7" (OuterVolumeSpecName: "kube-api-access-q2kp7") pod "201ce99d-6f07-4de4-a84e-ce221215a532" (UID: "201ce99d-6f07-4de4-a84e-ce221215a532"). InnerVolumeSpecName "kube-api-access-q2kp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.048786 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201ce99d-6f07-4de4-a84e-ce221215a532-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.048816 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2kp7\" (UniqueName: \"kubernetes.io/projected/201ce99d-6f07-4de4-a84e-ce221215a532-kube-api-access-q2kp7\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.546807 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fffh6" Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.546838 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fffh6" event={"ID":"201ce99d-6f07-4de4-a84e-ce221215a532","Type":"ContainerDied","Data":"1e05769c46987b3552a6de1c3d3cdc25fa482a47f3ea47be5c1cfeed7c6066db"} Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.546957 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e05769c46987b3552a6de1c3d3cdc25fa482a47f3ea47be5c1cfeed7c6066db" Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.899701 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.962426 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncfbm\" (UniqueName: \"kubernetes.io/projected/aad970bb-e71b-431c-ae77-d85133144832-kube-api-access-ncfbm\") pod \"aad970bb-e71b-431c-ae77-d85133144832\" (UID: \"aad970bb-e71b-431c-ae77-d85133144832\") " Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.962601 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aad970bb-e71b-431c-ae77-d85133144832-operator-scripts\") pod \"aad970bb-e71b-431c-ae77-d85133144832\" (UID: \"aad970bb-e71b-431c-ae77-d85133144832\") " Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.963116 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad970bb-e71b-431c-ae77-d85133144832-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aad970bb-e71b-431c-ae77-d85133144832" (UID: "aad970bb-e71b-431c-ae77-d85133144832"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:29 crc kubenswrapper[4789]: I1216 08:11:29.965477 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad970bb-e71b-431c-ae77-d85133144832-kube-api-access-ncfbm" (OuterVolumeSpecName: "kube-api-access-ncfbm") pod "aad970bb-e71b-431c-ae77-d85133144832" (UID: "aad970bb-e71b-431c-ae77-d85133144832"). InnerVolumeSpecName "kube-api-access-ncfbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:30 crc kubenswrapper[4789]: I1216 08:11:30.065149 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncfbm\" (UniqueName: \"kubernetes.io/projected/aad970bb-e71b-431c-ae77-d85133144832-kube-api-access-ncfbm\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:30 crc kubenswrapper[4789]: I1216 08:11:30.065207 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aad970bb-e71b-431c-ae77-d85133144832-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:30 crc kubenswrapper[4789]: I1216 08:11:30.558318 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-709f-account-create-update-vw46j" event={"ID":"aad970bb-e71b-431c-ae77-d85133144832","Type":"ContainerDied","Data":"383b67d4c3c4e5e0ce7b65c03d6bc04de4c34a2c07861623a2f8bf452b16e6e7"} Dec 16 08:11:30 crc kubenswrapper[4789]: I1216 08:11:30.559038 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="383b67d4c3c4e5e0ce7b65c03d6bc04de4c34a2c07861623a2f8bf452b16e6e7" Dec 16 08:11:30 crc kubenswrapper[4789]: I1216 08:11:30.558432 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-709f-account-create-update-vw46j" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.924245 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-k6fzp"] Dec 16 08:11:31 crc kubenswrapper[4789]: E1216 08:11:31.924628 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad970bb-e71b-431c-ae77-d85133144832" containerName="mariadb-account-create-update" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.924644 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad970bb-e71b-431c-ae77-d85133144832" containerName="mariadb-account-create-update" Dec 16 08:11:31 crc kubenswrapper[4789]: E1216 08:11:31.924662 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201ce99d-6f07-4de4-a84e-ce221215a532" containerName="mariadb-database-create" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.924668 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="201ce99d-6f07-4de4-a84e-ce221215a532" containerName="mariadb-database-create" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.924815 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="201ce99d-6f07-4de4-a84e-ce221215a532" containerName="mariadb-database-create" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.924827 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad970bb-e71b-431c-ae77-d85133144832" containerName="mariadb-account-create-update" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.925489 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.933562 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkpmb" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.933857 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.934024 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.934172 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.937559 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k6fzp"] Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.996564 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-combined-ca-bundle\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.996623 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzjs\" (UniqueName: \"kubernetes.io/projected/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-kube-api-access-4zzjs\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:31 crc kubenswrapper[4789]: I1216 08:11:31.996759 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-config-data\") pod \"keystone-db-sync-k6fzp\" (UID: 
\"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: I1216 08:11:32.097930 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-config-data\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: I1216 08:11:32.098336 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-combined-ca-bundle\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: I1216 08:11:32.098379 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzjs\" (UniqueName: \"kubernetes.io/projected/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-kube-api-access-4zzjs\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: I1216 08:11:32.103627 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-combined-ca-bundle\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: I1216 08:11:32.103666 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-config-data\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: 
I1216 08:11:32.117433 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzjs\" (UniqueName: \"kubernetes.io/projected/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-kube-api-access-4zzjs\") pod \"keystone-db-sync-k6fzp\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: I1216 08:11:32.265001 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:32 crc kubenswrapper[4789]: I1216 08:11:32.711465 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-k6fzp"] Dec 16 08:11:32 crc kubenswrapper[4789]: W1216 08:11:32.723344 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd4f4d7c_b258_4ad0_92eb_42b86ba4ce2e.slice/crio-ff29f6ce7e4aa8dcca7dce21727fcaa4a9652efe1e35e9baa86f4b4a4226dd27 WatchSource:0}: Error finding container ff29f6ce7e4aa8dcca7dce21727fcaa4a9652efe1e35e9baa86f4b4a4226dd27: Status 404 returned error can't find the container with id ff29f6ce7e4aa8dcca7dce21727fcaa4a9652efe1e35e9baa86f4b4a4226dd27 Dec 16 08:11:33 crc kubenswrapper[4789]: I1216 08:11:33.584542 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k6fzp" event={"ID":"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e","Type":"ContainerStarted","Data":"ff29f6ce7e4aa8dcca7dce21727fcaa4a9652efe1e35e9baa86f4b4a4226dd27"} Dec 16 08:11:36 crc kubenswrapper[4789]: I1216 08:11:36.934585 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 16 08:11:37 crc kubenswrapper[4789]: I1216 08:11:37.105302 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:11:37 crc kubenswrapper[4789]: E1216 08:11:37.105582 4789 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:11:38 crc kubenswrapper[4789]: I1216 08:11:38.644325 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k6fzp" event={"ID":"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e","Type":"ContainerStarted","Data":"937f80c64f7219d4b576f08960d8e0c2b347ac2e606386ec32fa85857201ac27"} Dec 16 08:11:38 crc kubenswrapper[4789]: I1216 08:11:38.670869 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-k6fzp" podStartSLOduration=3.025076552 podStartE2EDuration="7.670838535s" podCreationTimestamp="2025-12-16 08:11:31 +0000 UTC" firstStartedPulling="2025-12-16 08:11:32.727453125 +0000 UTC m=+4830.989340764" lastFinishedPulling="2025-12-16 08:11:37.373215108 +0000 UTC m=+4835.635102747" observedRunningTime="2025-12-16 08:11:38.665819223 +0000 UTC m=+4836.927706862" watchObservedRunningTime="2025-12-16 08:11:38.670838535 +0000 UTC m=+4836.932726164" Dec 16 08:11:39 crc kubenswrapper[4789]: I1216 08:11:39.659430 4789 generic.go:334] "Generic (PLEG): container finished" podID="cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" containerID="937f80c64f7219d4b576f08960d8e0c2b347ac2e606386ec32fa85857201ac27" exitCode=0 Dec 16 08:11:39 crc kubenswrapper[4789]: I1216 08:11:39.659534 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k6fzp" event={"ID":"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e","Type":"ContainerDied","Data":"937f80c64f7219d4b576f08960d8e0c2b347ac2e606386ec32fa85857201ac27"} Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.064025 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.164131 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zzjs\" (UniqueName: \"kubernetes.io/projected/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-kube-api-access-4zzjs\") pod \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.164193 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-config-data\") pod \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.164257 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-combined-ca-bundle\") pod \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\" (UID: \"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e\") " Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.169725 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-kube-api-access-4zzjs" (OuterVolumeSpecName: "kube-api-access-4zzjs") pod "cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" (UID: "cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e"). InnerVolumeSpecName "kube-api-access-4zzjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.197404 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" (UID: "cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.205114 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-config-data" (OuterVolumeSpecName: "config-data") pod "cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" (UID: "cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.266126 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zzjs\" (UniqueName: \"kubernetes.io/projected/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-kube-api-access-4zzjs\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.266389 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.266451 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.688282 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-k6fzp" event={"ID":"cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e","Type":"ContainerDied","Data":"ff29f6ce7e4aa8dcca7dce21727fcaa4a9652efe1e35e9baa86f4b4a4226dd27"} Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.688341 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff29f6ce7e4aa8dcca7dce21727fcaa4a9652efe1e35e9baa86f4b4a4226dd27" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.688326 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-k6fzp" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.925371 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-648dfd6c8f-bdbc6"] Dec 16 08:11:41 crc kubenswrapper[4789]: E1216 08:11:41.925702 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" containerName="keystone-db-sync" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.925718 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" containerName="keystone-db-sync" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.925872 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" containerName="keystone-db-sync" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.927028 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.941828 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648dfd6c8f-bdbc6"] Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.983267 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-sb\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.983352 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-dns-svc\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:41 crc 
kubenswrapper[4789]: I1216 08:11:41.983582 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-config\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.983673 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gfg9\" (UniqueName: \"kubernetes.io/projected/9aa7ce20-5022-4daa-af09-1e3c111449c5-kube-api-access-9gfg9\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.983732 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-nb\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.990958 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-h7m6c"] Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.992899 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.995672 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.996435 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.996574 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.996680 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:11:41 crc kubenswrapper[4789]: I1216 08:11:41.996879 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkpmb" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:41.999231 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h7m6c"] Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085556 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-config\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085619 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gfg9\" (UniqueName: \"kubernetes.io/projected/9aa7ce20-5022-4daa-af09-1e3c111449c5-kube-api-access-9gfg9\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085659 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-nb\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085682 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-combined-ca-bundle\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085717 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-credential-keys\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085747 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-sb\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085781 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-dns-svc\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085803 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-scripts\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085841 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-fernet-keys\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085875 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmcz\" (UniqueName: \"kubernetes.io/projected/1ee964bd-39c0-422c-b883-8aad701331ae-kube-api-access-6gmcz\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.085892 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-config-data\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.087922 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-sb\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.087997 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-dns-svc\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.088052 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-nb\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.088326 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-config\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.103806 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gfg9\" (UniqueName: \"kubernetes.io/projected/9aa7ce20-5022-4daa-af09-1e3c111449c5-kube-api-access-9gfg9\") pod \"dnsmasq-dns-648dfd6c8f-bdbc6\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") " pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.187445 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-fernet-keys\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.187761 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmcz\" (UniqueName: \"kubernetes.io/projected/1ee964bd-39c0-422c-b883-8aad701331ae-kube-api-access-6gmcz\") pod 
\"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.187803 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-config-data\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.187877 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-combined-ca-bundle\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.187952 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-credential-keys\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.188020 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-scripts\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.191631 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-scripts\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc 
kubenswrapper[4789]: I1216 08:11:42.192463 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-combined-ca-bundle\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.192700 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-config-data\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.194521 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-fernet-keys\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.194858 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-credential-keys\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.205891 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmcz\" (UniqueName: \"kubernetes.io/projected/1ee964bd-39c0-422c-b883-8aad701331ae-kube-api-access-6gmcz\") pod \"keystone-bootstrap-h7m6c\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.257969 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.324078 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.701968 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648dfd6c8f-bdbc6"] Dec 16 08:11:42 crc kubenswrapper[4789]: W1216 08:11:42.708657 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa7ce20_5022_4daa_af09_1e3c111449c5.slice/crio-4edd4d711a8f7f87de33a11e84533850b2a19d931debc48d6ca904f7e81e8c2e WatchSource:0}: Error finding container 4edd4d711a8f7f87de33a11e84533850b2a19d931debc48d6ca904f7e81e8c2e: Status 404 returned error can't find the container with id 4edd4d711a8f7f87de33a11e84533850b2a19d931debc48d6ca904f7e81e8c2e Dec 16 08:11:42 crc kubenswrapper[4789]: I1216 08:11:42.797816 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h7m6c"] Dec 16 08:11:42 crc kubenswrapper[4789]: W1216 08:11:42.803307 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee964bd_39c0_422c_b883_8aad701331ae.slice/crio-e7ecf9563dc3865d4b7d971df1ee867120fe7766bec68069fa1779340fc257f2 WatchSource:0}: Error finding container e7ecf9563dc3865d4b7d971df1ee867120fe7766bec68069fa1779340fc257f2: Status 404 returned error can't find the container with id e7ecf9563dc3865d4b7d971df1ee867120fe7766bec68069fa1779340fc257f2 Dec 16 08:11:43 crc kubenswrapper[4789]: I1216 08:11:43.708695 4789 generic.go:334] "Generic (PLEG): container finished" podID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerID="69df9522049cf9191373a4f1f282abcc668bf6b2f8559c3ba89e14bad7483b92" exitCode=0 Dec 16 08:11:43 crc kubenswrapper[4789]: I1216 08:11:43.708753 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" event={"ID":"9aa7ce20-5022-4daa-af09-1e3c111449c5","Type":"ContainerDied","Data":"69df9522049cf9191373a4f1f282abcc668bf6b2f8559c3ba89e14bad7483b92"} Dec 16 08:11:43 crc kubenswrapper[4789]: I1216 08:11:43.709060 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" event={"ID":"9aa7ce20-5022-4daa-af09-1e3c111449c5","Type":"ContainerStarted","Data":"4edd4d711a8f7f87de33a11e84533850b2a19d931debc48d6ca904f7e81e8c2e"} Dec 16 08:11:43 crc kubenswrapper[4789]: I1216 08:11:43.710677 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h7m6c" event={"ID":"1ee964bd-39c0-422c-b883-8aad701331ae","Type":"ContainerStarted","Data":"8061226e9f7920b0df6606b4cfb42c79c665529ac1476c77a0b98d6e652476e6"} Dec 16 08:11:43 crc kubenswrapper[4789]: I1216 08:11:43.710710 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h7m6c" event={"ID":"1ee964bd-39c0-422c-b883-8aad701331ae","Type":"ContainerStarted","Data":"e7ecf9563dc3865d4b7d971df1ee867120fe7766bec68069fa1779340fc257f2"} Dec 16 08:11:43 crc kubenswrapper[4789]: I1216 08:11:43.762583 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-h7m6c" podStartSLOduration=2.762566024 podStartE2EDuration="2.762566024s" podCreationTimestamp="2025-12-16 08:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:11:43.757217513 +0000 UTC m=+4842.019105152" watchObservedRunningTime="2025-12-16 08:11:43.762566024 +0000 UTC m=+4842.024453653" Dec 16 08:11:44 crc kubenswrapper[4789]: I1216 08:11:44.719093 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" 
event={"ID":"9aa7ce20-5022-4daa-af09-1e3c111449c5","Type":"ContainerStarted","Data":"a4e80ba7f5a8006ed4e23610e0c2d8e874b589f56f62dd9713e976a0f6e9dfb2"} Dec 16 08:11:44 crc kubenswrapper[4789]: I1216 08:11:44.720101 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:44 crc kubenswrapper[4789]: I1216 08:11:44.754893 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" podStartSLOduration=3.754869081 podStartE2EDuration="3.754869081s" podCreationTimestamp="2025-12-16 08:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:11:44.753661592 +0000 UTC m=+4843.015549231" watchObservedRunningTime="2025-12-16 08:11:44.754869081 +0000 UTC m=+4843.016756720" Dec 16 08:11:46 crc kubenswrapper[4789]: I1216 08:11:46.734532 4789 generic.go:334] "Generic (PLEG): container finished" podID="1ee964bd-39c0-422c-b883-8aad701331ae" containerID="8061226e9f7920b0df6606b4cfb42c79c665529ac1476c77a0b98d6e652476e6" exitCode=0 Dec 16 08:11:46 crc kubenswrapper[4789]: I1216 08:11:46.734601 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h7m6c" event={"ID":"1ee964bd-39c0-422c-b883-8aad701331ae","Type":"ContainerDied","Data":"8061226e9f7920b0df6606b4cfb42c79c665529ac1476c77a0b98d6e652476e6"} Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.101089 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.199875 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gmcz\" (UniqueName: \"kubernetes.io/projected/1ee964bd-39c0-422c-b883-8aad701331ae-kube-api-access-6gmcz\") pod \"1ee964bd-39c0-422c-b883-8aad701331ae\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.200039 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-config-data\") pod \"1ee964bd-39c0-422c-b883-8aad701331ae\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.200134 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-fernet-keys\") pod \"1ee964bd-39c0-422c-b883-8aad701331ae\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.200293 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-combined-ca-bundle\") pod \"1ee964bd-39c0-422c-b883-8aad701331ae\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.200338 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-credential-keys\") pod \"1ee964bd-39c0-422c-b883-8aad701331ae\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.200414 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-scripts\") pod \"1ee964bd-39c0-422c-b883-8aad701331ae\" (UID: \"1ee964bd-39c0-422c-b883-8aad701331ae\") " Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.206093 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-scripts" (OuterVolumeSpecName: "scripts") pod "1ee964bd-39c0-422c-b883-8aad701331ae" (UID: "1ee964bd-39c0-422c-b883-8aad701331ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.206349 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1ee964bd-39c0-422c-b883-8aad701331ae" (UID: "1ee964bd-39c0-422c-b883-8aad701331ae"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.206495 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1ee964bd-39c0-422c-b883-8aad701331ae" (UID: "1ee964bd-39c0-422c-b883-8aad701331ae"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.206778 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee964bd-39c0-422c-b883-8aad701331ae-kube-api-access-6gmcz" (OuterVolumeSpecName: "kube-api-access-6gmcz") pod "1ee964bd-39c0-422c-b883-8aad701331ae" (UID: "1ee964bd-39c0-422c-b883-8aad701331ae"). InnerVolumeSpecName "kube-api-access-6gmcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.223893 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ee964bd-39c0-422c-b883-8aad701331ae" (UID: "1ee964bd-39c0-422c-b883-8aad701331ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.225458 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-config-data" (OuterVolumeSpecName: "config-data") pod "1ee964bd-39c0-422c-b883-8aad701331ae" (UID: "1ee964bd-39c0-422c-b883-8aad701331ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.302376 4789 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.302406 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.302430 4789 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.302438 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:48 
crc kubenswrapper[4789]: I1216 08:11:48.302448 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gmcz\" (UniqueName: \"kubernetes.io/projected/1ee964bd-39c0-422c-b883-8aad701331ae-kube-api-access-6gmcz\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.302456 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee964bd-39c0-422c-b883-8aad701331ae-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.753643 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h7m6c" event={"ID":"1ee964bd-39c0-422c-b883-8aad701331ae","Type":"ContainerDied","Data":"e7ecf9563dc3865d4b7d971df1ee867120fe7766bec68069fa1779340fc257f2"} Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.753692 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h7m6c" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.753699 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7ecf9563dc3865d4b7d971df1ee867120fe7766bec68069fa1779340fc257f2" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.832311 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-h7m6c"] Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.839188 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-h7m6c"] Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.933652 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fnzlk"] Dec 16 08:11:48 crc kubenswrapper[4789]: E1216 08:11:48.934267 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee964bd-39c0-422c-b883-8aad701331ae" containerName="keystone-bootstrap" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.934281 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee964bd-39c0-422c-b883-8aad701331ae" containerName="keystone-bootstrap" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.934508 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee964bd-39c0-422c-b883-8aad701331ae" containerName="keystone-bootstrap" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.935331 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.937344 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.937695 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.938621 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkpmb" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.941663 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.941841 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:11:48 crc kubenswrapper[4789]: I1216 08:11:48.943355 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fnzlk"] Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.014741 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtndd\" (UniqueName: \"kubernetes.io/projected/ed5961a3-6529-4866-ba46-acc8e57c0dc1-kube-api-access-mtndd\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.014813 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-fernet-keys\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.014835 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-config-data\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.014992 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-combined-ca-bundle\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.015053 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-credential-keys\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.015214 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-scripts\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.106324 4789 scope.go:117] "RemoveContainer" 
containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:11:49 crc kubenswrapper[4789]: E1216 08:11:49.106802 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.117468 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-combined-ca-bundle\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.117717 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-credential-keys\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.117814 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-scripts\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.117937 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtndd\" (UniqueName: \"kubernetes.io/projected/ed5961a3-6529-4866-ba46-acc8e57c0dc1-kube-api-access-mtndd\") pod 
\"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.118044 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-fernet-keys\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.118082 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-config-data\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.121112 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-combined-ca-bundle\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.121243 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-credential-keys\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.121805 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-scripts\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc 
kubenswrapper[4789]: I1216 08:11:49.121840 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-config-data\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.122550 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-fernet-keys\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.136587 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtndd\" (UniqueName: \"kubernetes.io/projected/ed5961a3-6529-4866-ba46-acc8e57c0dc1-kube-api-access-mtndd\") pod \"keystone-bootstrap-fnzlk\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.259419 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.672862 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fnzlk"] Dec 16 08:11:49 crc kubenswrapper[4789]: W1216 08:11:49.674108 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded5961a3_6529_4866_ba46_acc8e57c0dc1.slice/crio-8a0a9777c2f67944de64fd5963358d86e03cee6199fa6a5ef5d048f5613b50f1 WatchSource:0}: Error finding container 8a0a9777c2f67944de64fd5963358d86e03cee6199fa6a5ef5d048f5613b50f1: Status 404 returned error can't find the container with id 8a0a9777c2f67944de64fd5963358d86e03cee6199fa6a5ef5d048f5613b50f1 Dec 16 08:11:49 crc kubenswrapper[4789]: I1216 08:11:49.763506 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fnzlk" event={"ID":"ed5961a3-6529-4866-ba46-acc8e57c0dc1","Type":"ContainerStarted","Data":"8a0a9777c2f67944de64fd5963358d86e03cee6199fa6a5ef5d048f5613b50f1"} Dec 16 08:11:50 crc kubenswrapper[4789]: I1216 08:11:50.116860 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee964bd-39c0-422c-b883-8aad701331ae" path="/var/lib/kubelet/pods/1ee964bd-39c0-422c-b883-8aad701331ae/volumes" Dec 16 08:11:50 crc kubenswrapper[4789]: I1216 08:11:50.774168 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fnzlk" event={"ID":"ed5961a3-6529-4866-ba46-acc8e57c0dc1","Type":"ContainerStarted","Data":"b45e8cdaac36c45fe8479f84d146a4c0bdc4e8d6add454265fd7f249755971f4"} Dec 16 08:11:50 crc kubenswrapper[4789]: I1216 08:11:50.798741 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fnzlk" podStartSLOduration=2.798721265 podStartE2EDuration="2.798721265s" podCreationTimestamp="2025-12-16 08:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:11:50.793259411 +0000 UTC m=+4849.055147060" watchObservedRunningTime="2025-12-16 08:11:50.798721265 +0000 UTC m=+4849.060608894" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.260120 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.312116 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4fffdc5f-4mkt4"] Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.312562 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" podUID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerName="dnsmasq-dns" containerID="cri-o://87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6" gracePeriod=10 Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.764987 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.789502 4789 generic.go:334] "Generic (PLEG): container finished" podID="ed5961a3-6529-4866-ba46-acc8e57c0dc1" containerID="b45e8cdaac36c45fe8479f84d146a4c0bdc4e8d6add454265fd7f249755971f4" exitCode=0 Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.789544 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fnzlk" event={"ID":"ed5961a3-6529-4866-ba46-acc8e57c0dc1","Type":"ContainerDied","Data":"b45e8cdaac36c45fe8479f84d146a4c0bdc4e8d6add454265fd7f249755971f4"} Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.817568 4789 generic.go:334] "Generic (PLEG): container finished" podID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerID="87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6" exitCode=0 Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.817608 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" event={"ID":"f39411a9-1613-48ed-9806-9433b9b22ba5","Type":"ContainerDied","Data":"87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6"} Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.817633 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" event={"ID":"f39411a9-1613-48ed-9806-9433b9b22ba5","Type":"ContainerDied","Data":"a684fa63c0e96e73e8a115c73df789c16a3b1c6fbc73fe416ea7d5d55a408132"} Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.817662 4789 scope.go:117] "RemoveContainer" containerID="87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.817810 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b4fffdc5f-4mkt4" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.854350 4789 scope.go:117] "RemoveContainer" containerID="bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.885700 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tc67\" (UniqueName: \"kubernetes.io/projected/f39411a9-1613-48ed-9806-9433b9b22ba5-kube-api-access-8tc67\") pod \"f39411a9-1613-48ed-9806-9433b9b22ba5\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.885897 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-nb\") pod \"f39411a9-1613-48ed-9806-9433b9b22ba5\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.886175 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-config\") pod \"f39411a9-1613-48ed-9806-9433b9b22ba5\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.886255 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-sb\") pod \"f39411a9-1613-48ed-9806-9433b9b22ba5\" (UID: \"f39411a9-1613-48ed-9806-9433b9b22ba5\") " Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.886309 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-dns-svc\") pod \"f39411a9-1613-48ed-9806-9433b9b22ba5\" (UID: 
\"f39411a9-1613-48ed-9806-9433b9b22ba5\") " Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.891463 4789 scope.go:117] "RemoveContainer" containerID="87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6" Dec 16 08:11:52 crc kubenswrapper[4789]: E1216 08:11:52.901104 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6\": container with ID starting with 87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6 not found: ID does not exist" containerID="87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.901160 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6"} err="failed to get container status \"87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6\": rpc error: code = NotFound desc = could not find container \"87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6\": container with ID starting with 87e334bb1bc2ab8b112c57fe95c459dd8c541974ded259ddebaebea42bdf05c6 not found: ID does not exist" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.901190 4789 scope.go:117] "RemoveContainer" containerID="bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e" Dec 16 08:11:52 crc kubenswrapper[4789]: E1216 08:11:52.909620 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e\": container with ID starting with bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e not found: ID does not exist" containerID="bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.909672 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e"} err="failed to get container status \"bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e\": rpc error: code = NotFound desc = could not find container \"bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e\": container with ID starting with bb519630428d8d937bc935e8a04b2929362ead9c0cc460c5065088a78fbfa27e not found: ID does not exist" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.915387 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39411a9-1613-48ed-9806-9433b9b22ba5-kube-api-access-8tc67" (OuterVolumeSpecName: "kube-api-access-8tc67") pod "f39411a9-1613-48ed-9806-9433b9b22ba5" (UID: "f39411a9-1613-48ed-9806-9433b9b22ba5"). InnerVolumeSpecName "kube-api-access-8tc67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.960852 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f39411a9-1613-48ed-9806-9433b9b22ba5" (UID: "f39411a9-1613-48ed-9806-9433b9b22ba5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.988813 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:52 crc kubenswrapper[4789]: I1216 08:11:52.988849 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tc67\" (UniqueName: \"kubernetes.io/projected/f39411a9-1613-48ed-9806-9433b9b22ba5-kube-api-access-8tc67\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.002928 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-config" (OuterVolumeSpecName: "config") pod "f39411a9-1613-48ed-9806-9433b9b22ba5" (UID: "f39411a9-1613-48ed-9806-9433b9b22ba5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.003159 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f39411a9-1613-48ed-9806-9433b9b22ba5" (UID: "f39411a9-1613-48ed-9806-9433b9b22ba5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.020362 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f39411a9-1613-48ed-9806-9433b9b22ba5" (UID: "f39411a9-1613-48ed-9806-9433b9b22ba5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.090789 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.090824 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.090834 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f39411a9-1613-48ed-9806-9433b9b22ba5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.147640 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4fffdc5f-4mkt4"] Dec 16 08:11:53 crc kubenswrapper[4789]: I1216 08:11:53.154170 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b4fffdc5f-4mkt4"] Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.116733 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39411a9-1613-48ed-9806-9433b9b22ba5" path="/var/lib/kubelet/pods/f39411a9-1613-48ed-9806-9433b9b22ba5/volumes" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.139409 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.211657 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-combined-ca-bundle\") pod \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.211746 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-fernet-keys\") pod \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.211867 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtndd\" (UniqueName: \"kubernetes.io/projected/ed5961a3-6529-4866-ba46-acc8e57c0dc1-kube-api-access-mtndd\") pod \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.212005 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-credential-keys\") pod \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.212164 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-scripts\") pod \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.212264 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-config-data\") pod \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\" (UID: \"ed5961a3-6529-4866-ba46-acc8e57c0dc1\") " Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.216694 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed5961a3-6529-4866-ba46-acc8e57c0dc1" (UID: "ed5961a3-6529-4866-ba46-acc8e57c0dc1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.216778 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5961a3-6529-4866-ba46-acc8e57c0dc1-kube-api-access-mtndd" (OuterVolumeSpecName: "kube-api-access-mtndd") pod "ed5961a3-6529-4866-ba46-acc8e57c0dc1" (UID: "ed5961a3-6529-4866-ba46-acc8e57c0dc1"). InnerVolumeSpecName "kube-api-access-mtndd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.216766 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed5961a3-6529-4866-ba46-acc8e57c0dc1" (UID: "ed5961a3-6529-4866-ba46-acc8e57c0dc1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.226193 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-scripts" (OuterVolumeSpecName: "scripts") pod "ed5961a3-6529-4866-ba46-acc8e57c0dc1" (UID: "ed5961a3-6529-4866-ba46-acc8e57c0dc1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.236495 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed5961a3-6529-4866-ba46-acc8e57c0dc1" (UID: "ed5961a3-6529-4866-ba46-acc8e57c0dc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.245247 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-config-data" (OuterVolumeSpecName: "config-data") pod "ed5961a3-6529-4866-ba46-acc8e57c0dc1" (UID: "ed5961a3-6529-4866-ba46-acc8e57c0dc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.314595 4789 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.314713 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.314727 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.314737 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:54 crc 
kubenswrapper[4789]: I1216 08:11:54.314751 4789 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed5961a3-6529-4866-ba46-acc8e57c0dc1-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.314762 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtndd\" (UniqueName: \"kubernetes.io/projected/ed5961a3-6529-4866-ba46-acc8e57c0dc1-kube-api-access-mtndd\") on node \"crc\" DevicePath \"\"" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.840748 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fnzlk" event={"ID":"ed5961a3-6529-4866-ba46-acc8e57c0dc1","Type":"ContainerDied","Data":"8a0a9777c2f67944de64fd5963358d86e03cee6199fa6a5ef5d048f5613b50f1"} Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.841664 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a0a9777c2f67944de64fd5963358d86e03cee6199fa6a5ef5d048f5613b50f1" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.841775 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fnzlk" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.908824 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6c9ff75b57-xt8cn"] Dec 16 08:11:54 crc kubenswrapper[4789]: E1216 08:11:54.909639 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerName="dnsmasq-dns" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.909670 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerName="dnsmasq-dns" Dec 16 08:11:54 crc kubenswrapper[4789]: E1216 08:11:54.909693 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5961a3-6529-4866-ba46-acc8e57c0dc1" containerName="keystone-bootstrap" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.909702 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5961a3-6529-4866-ba46-acc8e57c0dc1" containerName="keystone-bootstrap" Dec 16 08:11:54 crc kubenswrapper[4789]: E1216 08:11:54.909711 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerName="init" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.909720 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerName="init" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.909942 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5961a3-6529-4866-ba46-acc8e57c0dc1" containerName="keystone-bootstrap" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.909967 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39411a9-1613-48ed-9806-9433b9b22ba5" containerName="dnsmasq-dns" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.910667 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.913069 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkpmb" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.913181 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.913576 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.916121 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 08:11:54 crc kubenswrapper[4789]: I1216 08:11:54.929981 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6c9ff75b57-xt8cn"] Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.026208 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdwn\" (UniqueName: \"kubernetes.io/projected/88b34b95-4592-4ddc-ac54-686f169961d0-kube-api-access-ccdwn\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.026253 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-credential-keys\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.026321 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-fernet-keys\") pod 
\"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.026416 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-config-data\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.026435 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-scripts\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.026453 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-combined-ca-bundle\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.128627 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccdwn\" (UniqueName: \"kubernetes.io/projected/88b34b95-4592-4ddc-ac54-686f169961d0-kube-api-access-ccdwn\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.128747 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-credential-keys\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: 
\"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.128853 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-fernet-keys\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.128998 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-config-data\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.129059 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-scripts\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.129108 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-combined-ca-bundle\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.133795 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-config-data\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: 
I1216 08:11:55.133837 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-scripts\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.133968 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-fernet-keys\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.136465 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-combined-ca-bundle\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.141992 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/88b34b95-4592-4ddc-ac54-686f169961d0-credential-keys\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.147717 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdwn\" (UniqueName: \"kubernetes.io/projected/88b34b95-4592-4ddc-ac54-686f169961d0-kube-api-access-ccdwn\") pod \"keystone-6c9ff75b57-xt8cn\" (UID: \"88b34b95-4592-4ddc-ac54-686f169961d0\") " pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.228393 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.744677 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6c9ff75b57-xt8cn"] Dec 16 08:11:55 crc kubenswrapper[4789]: I1216 08:11:55.852285 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c9ff75b57-xt8cn" event={"ID":"88b34b95-4592-4ddc-ac54-686f169961d0","Type":"ContainerStarted","Data":"9c1079abf3f28c250ad088553313dc28148bb3330dd5bf5fb1b011cd1d7bea45"} Dec 16 08:11:56 crc kubenswrapper[4789]: I1216 08:11:56.899989 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6c9ff75b57-xt8cn" event={"ID":"88b34b95-4592-4ddc-ac54-686f169961d0","Type":"ContainerStarted","Data":"5d3f52c6f1f9c30740a415721a5277380e17bb6f4483518d7a6a7a374b760ae4"} Dec 16 08:11:56 crc kubenswrapper[4789]: I1216 08:11:56.900356 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:12:02 crc kubenswrapper[4789]: I1216 08:12:02.113712 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:12:02 crc kubenswrapper[4789]: E1216 08:12:02.114542 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:12:13 crc kubenswrapper[4789]: I1216 08:12:13.105424 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:12:13 crc kubenswrapper[4789]: E1216 08:12:13.106218 4789 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:12:26 crc kubenswrapper[4789]: I1216 08:12:26.105757 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:12:26 crc kubenswrapper[4789]: I1216 08:12:26.644813 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6c9ff75b57-xt8cn" Dec 16 08:12:26 crc kubenswrapper[4789]: I1216 08:12:26.681249 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6c9ff75b57-xt8cn" podStartSLOduration=32.681215689 podStartE2EDuration="32.681215689s" podCreationTimestamp="2025-12-16 08:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:11:56.93429803 +0000 UTC m=+4855.196185659" watchObservedRunningTime="2025-12-16 08:12:26.681215689 +0000 UTC m=+4884.943103318" Dec 16 08:12:27 crc kubenswrapper[4789]: I1216 08:12:27.128153 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"263dc2616d80f4ff03c2ff79d40529ccbb3a132a477a0c3fb859bf304de469fc"} Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.469206 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.471311 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.477883 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zwjxk" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.478244 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.479438 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.486052 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.629895 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58061711-79cd-4f55-946d-808fc5787077-openstack-config-secret\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.630059 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7qn\" (UniqueName: \"kubernetes.io/projected/58061711-79cd-4f55-946d-808fc5787077-kube-api-access-gn7qn\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.630338 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58061711-79cd-4f55-946d-808fc5787077-openstack-config\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.731827 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7qn\" (UniqueName: \"kubernetes.io/projected/58061711-79cd-4f55-946d-808fc5787077-kube-api-access-gn7qn\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.732122 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58061711-79cd-4f55-946d-808fc5787077-openstack-config\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.733554 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58061711-79cd-4f55-946d-808fc5787077-openstack-config\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.733819 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58061711-79cd-4f55-946d-808fc5787077-openstack-config-secret\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.743660 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58061711-79cd-4f55-946d-808fc5787077-openstack-config-secret\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.751873 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7qn\" (UniqueName: 
\"kubernetes.io/projected/58061711-79cd-4f55-946d-808fc5787077-kube-api-access-gn7qn\") pod \"openstackclient\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " pod="openstack/openstackclient" Dec 16 08:12:31 crc kubenswrapper[4789]: I1216 08:12:31.797977 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 08:12:32 crc kubenswrapper[4789]: I1216 08:12:32.219510 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 08:12:33 crc kubenswrapper[4789]: I1216 08:12:33.183701 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"58061711-79cd-4f55-946d-808fc5787077","Type":"ContainerStarted","Data":"266fbb71692ec0f80f54f2a0c5f1906f8ed38e2df0277aee922d9154e9c6b30a"} Dec 16 08:12:43 crc kubenswrapper[4789]: I1216 08:12:43.263046 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"58061711-79cd-4f55-946d-808fc5787077","Type":"ContainerStarted","Data":"41ecbc1741748ef28f56fd4730b31723aa751a4ab2e121ef8653900d8ed3b220"} Dec 16 08:12:43 crc kubenswrapper[4789]: I1216 08:12:43.283316 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.413156616 podStartE2EDuration="12.283298877s" podCreationTimestamp="2025-12-16 08:12:31 +0000 UTC" firstStartedPulling="2025-12-16 08:12:32.234146637 +0000 UTC m=+4890.496034266" lastFinishedPulling="2025-12-16 08:12:42.104288898 +0000 UTC m=+4900.366176527" observedRunningTime="2025-12-16 08:12:43.279317841 +0000 UTC m=+4901.541205480" watchObservedRunningTime="2025-12-16 08:12:43.283298877 +0000 UTC m=+4901.545186526" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.636975 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kc9t8"] Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.639161 4789 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.643772 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kc9t8"] Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.673710 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-utilities\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.673744 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-catalog-content\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.673811 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbzs\" (UniqueName: \"kubernetes.io/projected/e1257e46-f6db-4535-b4e4-43311d7bb414-kube-api-access-2rbzs\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.775320 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-utilities\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.775361 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-catalog-content\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.775440 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbzs\" (UniqueName: \"kubernetes.io/projected/e1257e46-f6db-4535-b4e4-43311d7bb414-kube-api-access-2rbzs\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.775841 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-utilities\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.775998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-catalog-content\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.800703 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbzs\" (UniqueName: \"kubernetes.io/projected/e1257e46-f6db-4535-b4e4-43311d7bb414-kube-api-access-2rbzs\") pod \"certified-operators-kc9t8\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:18 crc kubenswrapper[4789]: I1216 08:13:18.977208 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:19 crc kubenswrapper[4789]: I1216 08:13:19.501846 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kc9t8"] Dec 16 08:13:19 crc kubenswrapper[4789]: I1216 08:13:19.643040 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kc9t8" event={"ID":"e1257e46-f6db-4535-b4e4-43311d7bb414","Type":"ContainerStarted","Data":"6e962ad2dcce869dc407b5e223024034d5a30f6209dd5528e13f91614bc045ad"} Dec 16 08:13:20 crc kubenswrapper[4789]: I1216 08:13:20.652654 4789 generic.go:334] "Generic (PLEG): container finished" podID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerID="27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158" exitCode=0 Dec 16 08:13:20 crc kubenswrapper[4789]: I1216 08:13:20.652704 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kc9t8" event={"ID":"e1257e46-f6db-4535-b4e4-43311d7bb414","Type":"ContainerDied","Data":"27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158"} Dec 16 08:13:21 crc kubenswrapper[4789]: I1216 08:13:21.666142 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kc9t8" event={"ID":"e1257e46-f6db-4535-b4e4-43311d7bb414","Type":"ContainerStarted","Data":"d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727"} Dec 16 08:13:22 crc kubenswrapper[4789]: I1216 08:13:22.675857 4789 generic.go:334] "Generic (PLEG): container finished" podID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerID="d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727" exitCode=0 Dec 16 08:13:22 crc kubenswrapper[4789]: I1216 08:13:22.675994 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kc9t8" 
event={"ID":"e1257e46-f6db-4535-b4e4-43311d7bb414","Type":"ContainerDied","Data":"d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727"} Dec 16 08:13:23 crc kubenswrapper[4789]: I1216 08:13:23.685660 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kc9t8" event={"ID":"e1257e46-f6db-4535-b4e4-43311d7bb414","Type":"ContainerStarted","Data":"c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55"} Dec 16 08:13:23 crc kubenswrapper[4789]: I1216 08:13:23.705273 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kc9t8" podStartSLOduration=3.1314726139999998 podStartE2EDuration="5.705252975s" podCreationTimestamp="2025-12-16 08:13:18 +0000 UTC" firstStartedPulling="2025-12-16 08:13:20.655046472 +0000 UTC m=+4938.916934101" lastFinishedPulling="2025-12-16 08:13:23.228826823 +0000 UTC m=+4941.490714462" observedRunningTime="2025-12-16 08:13:23.70135884 +0000 UTC m=+4941.963246469" watchObservedRunningTime="2025-12-16 08:13:23.705252975 +0000 UTC m=+4941.967140594" Dec 16 08:13:28 crc kubenswrapper[4789]: I1216 08:13:28.977809 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:28 crc kubenswrapper[4789]: I1216 08:13:28.979390 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:29 crc kubenswrapper[4789]: I1216 08:13:29.051150 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:29 crc kubenswrapper[4789]: I1216 08:13:29.385638 4789 scope.go:117] "RemoveContainer" containerID="60e755ba1242bfe4598b34685476e17005acf5d2bf04a5e0e82f9adeed061b28" Dec 16 08:13:29 crc kubenswrapper[4789]: I1216 08:13:29.769977 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:29 crc kubenswrapper[4789]: I1216 08:13:29.827944 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kc9t8"] Dec 16 08:13:31 crc kubenswrapper[4789]: I1216 08:13:31.747624 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kc9t8" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="registry-server" containerID="cri-o://c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55" gracePeriod=2 Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.296600 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.386779 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rbzs\" (UniqueName: \"kubernetes.io/projected/e1257e46-f6db-4535-b4e4-43311d7bb414-kube-api-access-2rbzs\") pod \"e1257e46-f6db-4535-b4e4-43311d7bb414\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.386874 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-utilities\") pod \"e1257e46-f6db-4535-b4e4-43311d7bb414\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.387027 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-catalog-content\") pod \"e1257e46-f6db-4535-b4e4-43311d7bb414\" (UID: \"e1257e46-f6db-4535-b4e4-43311d7bb414\") " Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.387980 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-utilities" (OuterVolumeSpecName: "utilities") pod "e1257e46-f6db-4535-b4e4-43311d7bb414" (UID: "e1257e46-f6db-4535-b4e4-43311d7bb414"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.393432 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1257e46-f6db-4535-b4e4-43311d7bb414-kube-api-access-2rbzs" (OuterVolumeSpecName: "kube-api-access-2rbzs") pod "e1257e46-f6db-4535-b4e4-43311d7bb414" (UID: "e1257e46-f6db-4535-b4e4-43311d7bb414"). InnerVolumeSpecName "kube-api-access-2rbzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.445448 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1257e46-f6db-4535-b4e4-43311d7bb414" (UID: "e1257e46-f6db-4535-b4e4-43311d7bb414"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.488316 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.488353 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rbzs\" (UniqueName: \"kubernetes.io/projected/e1257e46-f6db-4535-b4e4-43311d7bb414-kube-api-access-2rbzs\") on node \"crc\" DevicePath \"\"" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.488365 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1257e46-f6db-4535-b4e4-43311d7bb414-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.758356 4789 generic.go:334] "Generic (PLEG): container finished" podID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerID="c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55" exitCode=0 Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.758403 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kc9t8" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.758407 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kc9t8" event={"ID":"e1257e46-f6db-4535-b4e4-43311d7bb414","Type":"ContainerDied","Data":"c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55"} Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.758582 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kc9t8" event={"ID":"e1257e46-f6db-4535-b4e4-43311d7bb414","Type":"ContainerDied","Data":"6e962ad2dcce869dc407b5e223024034d5a30f6209dd5528e13f91614bc045ad"} Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.758634 4789 scope.go:117] "RemoveContainer" containerID="c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.795149 4789 scope.go:117] "RemoveContainer" containerID="d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.797264 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kc9t8"] Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.809522 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kc9t8"] Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.823515 4789 scope.go:117] "RemoveContainer" containerID="27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.853195 4789 scope.go:117] "RemoveContainer" containerID="c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55" Dec 16 08:13:32 crc kubenswrapper[4789]: E1216 08:13:32.853582 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55\": container with ID starting with c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55 not found: ID does not exist" containerID="c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.853626 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55"} err="failed to get container status \"c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55\": rpc error: code = NotFound desc = could not find container \"c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55\": container with ID starting with c9d0a33e9953d608b01a7e1aeb7084f3fe753f256751527da7e4b25334668d55 not found: ID does not exist" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.853670 4789 scope.go:117] "RemoveContainer" containerID="d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727" Dec 16 08:13:32 crc kubenswrapper[4789]: E1216 08:13:32.854113 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727\": container with ID starting with d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727 not found: ID does not exist" containerID="d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.854246 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727"} err="failed to get container status \"d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727\": rpc error: code = NotFound desc = could not find container \"d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727\": container with ID 
starting with d3a8b059fceb148c5be89f6e53c0d68b5f0e02466b3f36dfa30fe9928d4eb727 not found: ID does not exist" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.854294 4789 scope.go:117] "RemoveContainer" containerID="27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158" Dec 16 08:13:32 crc kubenswrapper[4789]: E1216 08:13:32.854535 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158\": container with ID starting with 27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158 not found: ID does not exist" containerID="27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158" Dec 16 08:13:32 crc kubenswrapper[4789]: I1216 08:13:32.854557 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158"} err="failed to get container status \"27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158\": rpc error: code = NotFound desc = could not find container \"27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158\": container with ID starting with 27b5ac0d844768ddc91876434a9e5e76703f3e01c9d5a764c07e7b0ad3d74158 not found: ID does not exist" Dec 16 08:13:34 crc kubenswrapper[4789]: I1216 08:13:34.120667 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" path="/var/lib/kubelet/pods/e1257e46-f6db-4535-b4e4-43311d7bb414/volumes" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.548652 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wb5xw"] Dec 16 08:14:04 crc kubenswrapper[4789]: E1216 08:14:04.549543 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="extract-content" Dec 16 08:14:04 crc kubenswrapper[4789]: 
I1216 08:14:04.549560 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="extract-content" Dec 16 08:14:04 crc kubenswrapper[4789]: E1216 08:14:04.549575 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="registry-server" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.549581 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="registry-server" Dec 16 08:14:04 crc kubenswrapper[4789]: E1216 08:14:04.549602 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="extract-utilities" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.549609 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="extract-utilities" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.549787 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1257e46-f6db-4535-b4e4-43311d7bb414" containerName="registry-server" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.558712 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wb5xw"] Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.558871 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.635962 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f6889d-889d-411f-be17-dcf5d7189a24-operator-scripts\") pod \"barbican-db-create-wb5xw\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.636036 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7pn\" (UniqueName: \"kubernetes.io/projected/e0f6889d-889d-411f-be17-dcf5d7189a24-kube-api-access-fb7pn\") pod \"barbican-db-create-wb5xw\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.653956 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-406b-account-create-update-c2wxr"] Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.655162 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.658070 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.661395 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-406b-account-create-update-c2wxr"] Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.737454 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f6889d-889d-411f-be17-dcf5d7189a24-operator-scripts\") pod \"barbican-db-create-wb5xw\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.737520 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7pn\" (UniqueName: \"kubernetes.io/projected/e0f6889d-889d-411f-be17-dcf5d7189a24-kube-api-access-fb7pn\") pod \"barbican-db-create-wb5xw\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.737554 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-operator-scripts\") pod \"barbican-406b-account-create-update-c2wxr\" (UID: \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.737600 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5qwq\" (UniqueName: \"kubernetes.io/projected/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-kube-api-access-b5qwq\") pod \"barbican-406b-account-create-update-c2wxr\" (UID: 
\"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.738360 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f6889d-889d-411f-be17-dcf5d7189a24-operator-scripts\") pod \"barbican-db-create-wb5xw\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.773232 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7pn\" (UniqueName: \"kubernetes.io/projected/e0f6889d-889d-411f-be17-dcf5d7189a24-kube-api-access-fb7pn\") pod \"barbican-db-create-wb5xw\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.838989 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-operator-scripts\") pod \"barbican-406b-account-create-update-c2wxr\" (UID: \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.839432 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5qwq\" (UniqueName: \"kubernetes.io/projected/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-kube-api-access-b5qwq\") pod \"barbican-406b-account-create-update-c2wxr\" (UID: \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.840555 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-operator-scripts\") pod 
\"barbican-406b-account-create-update-c2wxr\" (UID: \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.855575 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5qwq\" (UniqueName: \"kubernetes.io/projected/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-kube-api-access-b5qwq\") pod \"barbican-406b-account-create-update-c2wxr\" (UID: \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:04 crc kubenswrapper[4789]: I1216 08:14:04.877613 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:05 crc kubenswrapper[4789]: I1216 08:14:05.025063 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:05 crc kubenswrapper[4789]: I1216 08:14:05.359388 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wb5xw"] Dec 16 08:14:05 crc kubenswrapper[4789]: I1216 08:14:05.475534 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-406b-account-create-update-c2wxr"] Dec 16 08:14:05 crc kubenswrapper[4789]: W1216 08:14:05.481767 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce88088b_1df2_439e_a1c6_9ff81ac4c86d.slice/crio-19323a47a2d0cdc75f88472651b0b3d293240d53edc3a57b83e79e690337c787 WatchSource:0}: Error finding container 19323a47a2d0cdc75f88472651b0b3d293240d53edc3a57b83e79e690337c787: Status 404 returned error can't find the container with id 19323a47a2d0cdc75f88472651b0b3d293240d53edc3a57b83e79e690337c787 Dec 16 08:14:06 crc kubenswrapper[4789]: I1216 08:14:06.057147 4789 generic.go:334] "Generic (PLEG): container finished" podID="e0f6889d-889d-411f-be17-dcf5d7189a24" 
containerID="a6b75627db7c0e28090161e5d7c0c6b8e3a8c0bf761d9d4bd0529b47f75cf842" exitCode=0 Dec 16 08:14:06 crc kubenswrapper[4789]: I1216 08:14:06.057258 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wb5xw" event={"ID":"e0f6889d-889d-411f-be17-dcf5d7189a24","Type":"ContainerDied","Data":"a6b75627db7c0e28090161e5d7c0c6b8e3a8c0bf761d9d4bd0529b47f75cf842"} Dec 16 08:14:06 crc kubenswrapper[4789]: I1216 08:14:06.057289 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wb5xw" event={"ID":"e0f6889d-889d-411f-be17-dcf5d7189a24","Type":"ContainerStarted","Data":"d6cb7f155d4ff92bad55360b23ec730055e8ea7624cd9c68c23c426492b7cd58"} Dec 16 08:14:06 crc kubenswrapper[4789]: I1216 08:14:06.061111 4789 generic.go:334] "Generic (PLEG): container finished" podID="ce88088b-1df2-439e-a1c6-9ff81ac4c86d" containerID="eb7aaff6096b1844d408960c85089bfbbcdea51e4a75c3300eba39f2b5bfed56" exitCode=0 Dec 16 08:14:06 crc kubenswrapper[4789]: I1216 08:14:06.061148 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-406b-account-create-update-c2wxr" event={"ID":"ce88088b-1df2-439e-a1c6-9ff81ac4c86d","Type":"ContainerDied","Data":"eb7aaff6096b1844d408960c85089bfbbcdea51e4a75c3300eba39f2b5bfed56"} Dec 16 08:14:06 crc kubenswrapper[4789]: I1216 08:14:06.061171 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-406b-account-create-update-c2wxr" event={"ID":"ce88088b-1df2-439e-a1c6-9ff81ac4c86d","Type":"ContainerStarted","Data":"19323a47a2d0cdc75f88472651b0b3d293240d53edc3a57b83e79e690337c787"} Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.428117 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.437489 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.490439 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5qwq\" (UniqueName: \"kubernetes.io/projected/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-kube-api-access-b5qwq\") pod \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\" (UID: \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.490506 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-operator-scripts\") pod \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\" (UID: \"ce88088b-1df2-439e-a1c6-9ff81ac4c86d\") " Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.490524 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f6889d-889d-411f-be17-dcf5d7189a24-operator-scripts\") pod \"e0f6889d-889d-411f-be17-dcf5d7189a24\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.490544 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb7pn\" (UniqueName: \"kubernetes.io/projected/e0f6889d-889d-411f-be17-dcf5d7189a24-kube-api-access-fb7pn\") pod \"e0f6889d-889d-411f-be17-dcf5d7189a24\" (UID: \"e0f6889d-889d-411f-be17-dcf5d7189a24\") " Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.491075 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce88088b-1df2-439e-a1c6-9ff81ac4c86d" (UID: "ce88088b-1df2-439e-a1c6-9ff81ac4c86d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.491345 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f6889d-889d-411f-be17-dcf5d7189a24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0f6889d-889d-411f-be17-dcf5d7189a24" (UID: "e0f6889d-889d-411f-be17-dcf5d7189a24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.496216 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f6889d-889d-411f-be17-dcf5d7189a24-kube-api-access-fb7pn" (OuterVolumeSpecName: "kube-api-access-fb7pn") pod "e0f6889d-889d-411f-be17-dcf5d7189a24" (UID: "e0f6889d-889d-411f-be17-dcf5d7189a24"). InnerVolumeSpecName "kube-api-access-fb7pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.496684 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-kube-api-access-b5qwq" (OuterVolumeSpecName: "kube-api-access-b5qwq") pod "ce88088b-1df2-439e-a1c6-9ff81ac4c86d" (UID: "ce88088b-1df2-439e-a1c6-9ff81ac4c86d"). InnerVolumeSpecName "kube-api-access-b5qwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.592998 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5qwq\" (UniqueName: \"kubernetes.io/projected/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-kube-api-access-b5qwq\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.593031 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce88088b-1df2-439e-a1c6-9ff81ac4c86d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.593043 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0f6889d-889d-411f-be17-dcf5d7189a24-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:07 crc kubenswrapper[4789]: I1216 08:14:07.593054 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb7pn\" (UniqueName: \"kubernetes.io/projected/e0f6889d-889d-411f-be17-dcf5d7189a24-kube-api-access-fb7pn\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:08 crc kubenswrapper[4789]: I1216 08:14:08.083904 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wb5xw" event={"ID":"e0f6889d-889d-411f-be17-dcf5d7189a24","Type":"ContainerDied","Data":"d6cb7f155d4ff92bad55360b23ec730055e8ea7624cd9c68c23c426492b7cd58"} Dec 16 08:14:08 crc kubenswrapper[4789]: I1216 08:14:08.084212 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6cb7f155d4ff92bad55360b23ec730055e8ea7624cd9c68c23c426492b7cd58" Dec 16 08:14:08 crc kubenswrapper[4789]: I1216 08:14:08.084107 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wb5xw" Dec 16 08:14:08 crc kubenswrapper[4789]: I1216 08:14:08.087664 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-406b-account-create-update-c2wxr" event={"ID":"ce88088b-1df2-439e-a1c6-9ff81ac4c86d","Type":"ContainerDied","Data":"19323a47a2d0cdc75f88472651b0b3d293240d53edc3a57b83e79e690337c787"} Dec 16 08:14:08 crc kubenswrapper[4789]: I1216 08:14:08.087699 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19323a47a2d0cdc75f88472651b0b3d293240d53edc3a57b83e79e690337c787" Dec 16 08:14:08 crc kubenswrapper[4789]: I1216 08:14:08.087703 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-406b-account-create-update-c2wxr" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.893790 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-96q8n"] Dec 16 08:14:09 crc kubenswrapper[4789]: E1216 08:14:09.894370 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f6889d-889d-411f-be17-dcf5d7189a24" containerName="mariadb-database-create" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.894382 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f6889d-889d-411f-be17-dcf5d7189a24" containerName="mariadb-database-create" Dec 16 08:14:09 crc kubenswrapper[4789]: E1216 08:14:09.894406 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce88088b-1df2-439e-a1c6-9ff81ac4c86d" containerName="mariadb-account-create-update" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.894412 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce88088b-1df2-439e-a1c6-9ff81ac4c86d" containerName="mariadb-account-create-update" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.894558 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce88088b-1df2-439e-a1c6-9ff81ac4c86d" 
containerName="mariadb-account-create-update" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.894576 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f6889d-889d-411f-be17-dcf5d7189a24" containerName="mariadb-database-create" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.895189 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.899442 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q6tjp" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.899674 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.914024 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-96q8n"] Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.931853 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvq2r\" (UniqueName: \"kubernetes.io/projected/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-kube-api-access-fvq2r\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.931901 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-db-sync-config-data\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:09 crc kubenswrapper[4789]: I1216 08:14:09.932070 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-combined-ca-bundle\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.033927 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvq2r\" (UniqueName: \"kubernetes.io/projected/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-kube-api-access-fvq2r\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.033977 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-db-sync-config-data\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.034050 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-combined-ca-bundle\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.040582 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-combined-ca-bundle\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.050407 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-db-sync-config-data\") pod 
\"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.051297 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvq2r\" (UniqueName: \"kubernetes.io/projected/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-kube-api-access-fvq2r\") pod \"barbican-db-sync-96q8n\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.219836 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:10 crc kubenswrapper[4789]: I1216 08:14:10.750126 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-96q8n"] Dec 16 08:14:11 crc kubenswrapper[4789]: I1216 08:14:11.111306 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96q8n" event={"ID":"6a3f7fc5-bcab-4dd2-a144-a075499d6a12","Type":"ContainerStarted","Data":"13f2e0b4d3f989a57b2f3b77283a94c09f24da38f9df3d4096b43ecdb6d38ee4"} Dec 16 08:14:18 crc kubenswrapper[4789]: I1216 08:14:18.177841 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96q8n" event={"ID":"6a3f7fc5-bcab-4dd2-a144-a075499d6a12","Type":"ContainerStarted","Data":"df31c7050a286d36f3cb0e1122242b4802ed650e327459774199353e29e7c125"} Dec 16 08:14:19 crc kubenswrapper[4789]: I1216 08:14:19.188402 4789 generic.go:334] "Generic (PLEG): container finished" podID="6a3f7fc5-bcab-4dd2-a144-a075499d6a12" containerID="df31c7050a286d36f3cb0e1122242b4802ed650e327459774199353e29e7c125" exitCode=0 Dec 16 08:14:19 crc kubenswrapper[4789]: I1216 08:14:19.188448 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96q8n" 
event={"ID":"6a3f7fc5-bcab-4dd2-a144-a075499d6a12","Type":"ContainerDied","Data":"df31c7050a286d36f3cb0e1122242b4802ed650e327459774199353e29e7c125"} Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.554095 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.630886 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-combined-ca-bundle\") pod \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.631073 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-db-sync-config-data\") pod \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.631161 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvq2r\" (UniqueName: \"kubernetes.io/projected/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-kube-api-access-fvq2r\") pod \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\" (UID: \"6a3f7fc5-bcab-4dd2-a144-a075499d6a12\") " Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.636860 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6a3f7fc5-bcab-4dd2-a144-a075499d6a12" (UID: "6a3f7fc5-bcab-4dd2-a144-a075499d6a12"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.637330 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-kube-api-access-fvq2r" (OuterVolumeSpecName: "kube-api-access-fvq2r") pod "6a3f7fc5-bcab-4dd2-a144-a075499d6a12" (UID: "6a3f7fc5-bcab-4dd2-a144-a075499d6a12"). InnerVolumeSpecName "kube-api-access-fvq2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.654981 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a3f7fc5-bcab-4dd2-a144-a075499d6a12" (UID: "6a3f7fc5-bcab-4dd2-a144-a075499d6a12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.732805 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvq2r\" (UniqueName: \"kubernetes.io/projected/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-kube-api-access-fvq2r\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.732846 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:20 crc kubenswrapper[4789]: I1216 08:14:20.732858 4789 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6a3f7fc5-bcab-4dd2-a144-a075499d6a12-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.212046 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96q8n" 
event={"ID":"6a3f7fc5-bcab-4dd2-a144-a075499d6a12","Type":"ContainerDied","Data":"13f2e0b4d3f989a57b2f3b77283a94c09f24da38f9df3d4096b43ecdb6d38ee4"} Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.212086 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13f2e0b4d3f989a57b2f3b77283a94c09f24da38f9df3d4096b43ecdb6d38ee4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.212195 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-96q8n" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.459389 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d594f4d55-cjsm4"] Dec 16 08:14:21 crc kubenswrapper[4789]: E1216 08:14:21.459836 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3f7fc5-bcab-4dd2-a144-a075499d6a12" containerName="barbican-db-sync" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.459858 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3f7fc5-bcab-4dd2-a144-a075499d6a12" containerName="barbican-db-sync" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.460120 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3f7fc5-bcab-4dd2-a144-a075499d6a12" containerName="barbican-db-sync" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.461200 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.466543 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q6tjp" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.471526 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.471744 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7df795b6b4-6qb9j"] Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.473424 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.473845 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.482399 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.486343 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d594f4d55-cjsm4"] Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.508547 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7df795b6b4-6qb9j"] Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547479 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-config-data-custom\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547542 
4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-config-data\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547579 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58a03d3a-03c2-47af-872b-2aed04c99bbc-logs\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547604 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-combined-ca-bundle\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547637 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-combined-ca-bundle\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547667 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trnjf\" (UniqueName: \"kubernetes.io/projected/06e922aa-fb79-405f-afd2-dc07a0bc8809-kube-api-access-trnjf\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: 
\"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547708 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-config-data-custom\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547764 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e922aa-fb79-405f-afd2-dc07a0bc8809-logs\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547800 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-config-data\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.547830 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmpzj\" (UniqueName: \"kubernetes.io/projected/58a03d3a-03c2-47af-872b-2aed04c99bbc-kube-api-access-qmpzj\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.558803 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64f998756c-btfjj"] Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.569591 4789 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.574773 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64f998756c-btfjj"] Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649125 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-nb\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649195 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-config\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649284 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-config-data-custom\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649324 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-config-data\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649359 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58a03d3a-03c2-47af-872b-2aed04c99bbc-logs\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649384 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-combined-ca-bundle\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649416 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-combined-ca-bundle\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649439 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-dns-svc\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649469 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trnjf\" (UniqueName: \"kubernetes.io/projected/06e922aa-fb79-405f-afd2-dc07a0bc8809-kube-api-access-trnjf\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc 
kubenswrapper[4789]: I1216 08:14:21.649541 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-config-data-custom\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649602 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e922aa-fb79-405f-afd2-dc07a0bc8809-logs\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649638 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-sb\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649668 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-config-data\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649704 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmpzj\" (UniqueName: \"kubernetes.io/projected/58a03d3a-03c2-47af-872b-2aed04c99bbc-kube-api-access-qmpzj\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 
08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.649729 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmldf\" (UniqueName: \"kubernetes.io/projected/185bafa0-88c8-4439-b600-e6c906face05-kube-api-access-bmldf\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.650066 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58a03d3a-03c2-47af-872b-2aed04c99bbc-logs\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.652069 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06e922aa-fb79-405f-afd2-dc07a0bc8809-logs\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.655415 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-combined-ca-bundle\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4" Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.656959 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-config-data-custom\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" Dec 16 
08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.659999 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-combined-ca-bundle\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.668762 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a03d3a-03c2-47af-872b-2aed04c99bbc-config-data\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.669263 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55c9b758c6-tbswb"]
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.670765 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.670861 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-config-data-custom\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.677163 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e922aa-fb79-405f-afd2-dc07a0bc8809-config-data\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.677739 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmpzj\" (UniqueName: \"kubernetes.io/projected/58a03d3a-03c2-47af-872b-2aed04c99bbc-kube-api-access-qmpzj\") pod \"barbican-keystone-listener-7df795b6b4-6qb9j\" (UID: \"58a03d3a-03c2-47af-872b-2aed04c99bbc\") " pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.683941 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55c9b758c6-tbswb"]
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.688284 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.691412 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trnjf\" (UniqueName: \"kubernetes.io/projected/06e922aa-fb79-405f-afd2-dc07a0bc8809-kube-api-access-trnjf\") pod \"barbican-worker-6d594f4d55-cjsm4\" (UID: \"06e922aa-fb79-405f-afd2-dc07a0bc8809\") " pod="openstack/barbican-worker-6d594f4d55-cjsm4"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751162 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-config-data\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751221 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-dns-svc\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751243 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-combined-ca-bundle\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751262 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25bv4\" (UniqueName: \"kubernetes.io/projected/a6470a1d-33f9-4895-b6af-f797aedf568e-kube-api-access-25bv4\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751296 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-config-data-custom\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751328 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-sb\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751359 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmldf\" (UniqueName: \"kubernetes.io/projected/185bafa0-88c8-4439-b600-e6c906face05-kube-api-access-bmldf\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751389 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6470a1d-33f9-4895-b6af-f797aedf568e-logs\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751407 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-nb\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.751430 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-config\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.752642 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-config\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.752690 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-dns-svc\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.753380 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-sb\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.753652 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-nb\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.767661 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmldf\" (UniqueName: \"kubernetes.io/projected/185bafa0-88c8-4439-b600-e6c906face05-kube-api-access-bmldf\") pod \"dnsmasq-dns-64f998756c-btfjj\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.788637 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d594f4d55-cjsm4"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.807557 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.852880 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-config-data-custom\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.853009 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6470a1d-33f9-4895-b6af-f797aedf568e-logs\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.853061 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-config-data\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.853103 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-combined-ca-bundle\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.853124 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25bv4\" (UniqueName: \"kubernetes.io/projected/a6470a1d-33f9-4895-b6af-f797aedf568e-kube-api-access-25bv4\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.857013 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-config-data-custom\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.857309 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6470a1d-33f9-4895-b6af-f797aedf568e-logs\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.866037 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-combined-ca-bundle\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.866491 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6470a1d-33f9-4895-b6af-f797aedf568e-config-data\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.880449 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25bv4\" (UniqueName: \"kubernetes.io/projected/a6470a1d-33f9-4895-b6af-f797aedf568e-kube-api-access-25bv4\") pod \"barbican-api-55c9b758c6-tbswb\" (UID: \"a6470a1d-33f9-4895-b6af-f797aedf568e\") " pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:21 crc kubenswrapper[4789]: I1216 08:14:21.893363 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:22 crc kubenswrapper[4789]: I1216 08:14:22.038324 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:22 crc kubenswrapper[4789]: I1216 08:14:22.345742 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d594f4d55-cjsm4"]
Dec 16 08:14:22 crc kubenswrapper[4789]: I1216 08:14:22.402749 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7df795b6b4-6qb9j"]
Dec 16 08:14:22 crc kubenswrapper[4789]: W1216 08:14:22.406927 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58a03d3a_03c2_47af_872b_2aed04c99bbc.slice/crio-cb396342f6bf363986980abc19e6b51b746ad50caac0dc336f56429b55eadfff WatchSource:0}: Error finding container cb396342f6bf363986980abc19e6b51b746ad50caac0dc336f56429b55eadfff: Status 404 returned error can't find the container with id cb396342f6bf363986980abc19e6b51b746ad50caac0dc336f56429b55eadfff
Dec 16 08:14:22 crc kubenswrapper[4789]: I1216 08:14:22.460699 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64f998756c-btfjj"]
Dec 16 08:14:22 crc kubenswrapper[4789]: W1216 08:14:22.462382 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod185bafa0_88c8_4439_b600_e6c906face05.slice/crio-4f4b82069c9f3270cd68c25a7efbf13b307091ea86dd6c1a954b4b563276f38c WatchSource:0}: Error finding container 4f4b82069c9f3270cd68c25a7efbf13b307091ea86dd6c1a954b4b563276f38c: Status 404 returned error can't find the container with id 4f4b82069c9f3270cd68c25a7efbf13b307091ea86dd6c1a954b4b563276f38c
Dec 16 08:14:22 crc kubenswrapper[4789]: I1216 08:14:22.549222 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55c9b758c6-tbswb"]
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.236138 4789 generic.go:334] "Generic (PLEG): container finished" podID="185bafa0-88c8-4439-b600-e6c906face05" containerID="108a401501389686f042ae7c47c8838cf6de7813a6a1b5e1b7100187068ae400" exitCode=0
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.236233 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f998756c-btfjj" event={"ID":"185bafa0-88c8-4439-b600-e6c906face05","Type":"ContainerDied","Data":"108a401501389686f042ae7c47c8838cf6de7813a6a1b5e1b7100187068ae400"}
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.236543 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f998756c-btfjj" event={"ID":"185bafa0-88c8-4439-b600-e6c906face05","Type":"ContainerStarted","Data":"4f4b82069c9f3270cd68c25a7efbf13b307091ea86dd6c1a954b4b563276f38c"}
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.242258 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9b758c6-tbswb" event={"ID":"a6470a1d-33f9-4895-b6af-f797aedf568e","Type":"ContainerStarted","Data":"2c10699a02067b64313357e489c63d1e4d9f0c9f2fb620e815a5a50939e1afc0"}
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.242301 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9b758c6-tbswb" event={"ID":"a6470a1d-33f9-4895-b6af-f797aedf568e","Type":"ContainerStarted","Data":"e4a7d2a813c28a4d38bcc8a5a8c5a30674ff6190c5d2b5fda38af1d62e0c9454"}
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.242314 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55c9b758c6-tbswb" event={"ID":"a6470a1d-33f9-4895-b6af-f797aedf568e","Type":"ContainerStarted","Data":"9fb70b040533a7f1afafde1bb3f5bcda7eaba6ceb13c4b915510a3419e9c7527"}
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.242351 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.242404 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.245157 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" event={"ID":"58a03d3a-03c2-47af-872b-2aed04c99bbc","Type":"ContainerStarted","Data":"cb396342f6bf363986980abc19e6b51b746ad50caac0dc336f56429b55eadfff"}
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.248135 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d594f4d55-cjsm4" event={"ID":"06e922aa-fb79-405f-afd2-dc07a0bc8809","Type":"ContainerStarted","Data":"0f38f8f48a627973553e0fdc3d472b44dac6986fd2f19fcca039274f027de5eb"}
Dec 16 08:14:23 crc kubenswrapper[4789]: I1216 08:14:23.291700 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55c9b758c6-tbswb" podStartSLOduration=2.291676945 podStartE2EDuration="2.291676945s" podCreationTimestamp="2025-12-16 08:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:14:23.280595904 +0000 UTC m=+5001.542483553" watchObservedRunningTime="2025-12-16 08:14:23.291676945 +0000 UTC m=+5001.553564574"
Dec 16 08:14:24 crc kubenswrapper[4789]: I1216 08:14:24.257629 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f998756c-btfjj" event={"ID":"185bafa0-88c8-4439-b600-e6c906face05","Type":"ContainerStarted","Data":"eef18103d1408ab494bf22771526a9bffe1cf486e7539782cd4982d78e492bbe"}
Dec 16 08:14:24 crc kubenswrapper[4789]: I1216 08:14:24.258070 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:24 crc kubenswrapper[4789]: I1216 08:14:24.293099 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64f998756c-btfjj" podStartSLOduration=3.293077944 podStartE2EDuration="3.293077944s" podCreationTimestamp="2025-12-16 08:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:14:24.282010694 +0000 UTC m=+5002.543898323" watchObservedRunningTime="2025-12-16 08:14:24.293077944 +0000 UTC m=+5002.554965583"
Dec 16 08:14:25 crc kubenswrapper[4789]: I1216 08:14:25.272489 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" event={"ID":"58a03d3a-03c2-47af-872b-2aed04c99bbc","Type":"ContainerStarted","Data":"151e3fb7a3e9a755d89a699c98f0149f040709d505b5888bd7a57e117661eb09"}
Dec 16 08:14:25 crc kubenswrapper[4789]: I1216 08:14:25.272946 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" event={"ID":"58a03d3a-03c2-47af-872b-2aed04c99bbc","Type":"ContainerStarted","Data":"de7867ab5ce98843145332ef9c0b1847adc60f470758b67bbffb2e9c2bd7f597"}
Dec 16 08:14:25 crc kubenswrapper[4789]: I1216 08:14:25.277486 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d594f4d55-cjsm4" event={"ID":"06e922aa-fb79-405f-afd2-dc07a0bc8809","Type":"ContainerStarted","Data":"2763499dfe2490a66a4006ec8864386b66d3dc5666a2a4f50bea529832df6218"}
Dec 16 08:14:25 crc kubenswrapper[4789]: I1216 08:14:25.277527 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d594f4d55-cjsm4" event={"ID":"06e922aa-fb79-405f-afd2-dc07a0bc8809","Type":"ContainerStarted","Data":"69a268e39b3f254024d7f4091743c441998814268103a5d3d82831a68c9c14ad"}
Dec 16 08:14:25 crc kubenswrapper[4789]: I1216 08:14:25.301079 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7df795b6b4-6qb9j" podStartSLOduration=2.47314223 podStartE2EDuration="4.301060305s" podCreationTimestamp="2025-12-16 08:14:21 +0000 UTC" firstStartedPulling="2025-12-16 08:14:22.409443497 +0000 UTC m=+5000.671331126" lastFinishedPulling="2025-12-16 08:14:24.237361562 +0000 UTC m=+5002.499249201" observedRunningTime="2025-12-16 08:14:25.296529824 +0000 UTC m=+5003.558417453" watchObservedRunningTime="2025-12-16 08:14:25.301060305 +0000 UTC m=+5003.562947934"
Dec 16 08:14:28 crc kubenswrapper[4789]: I1216 08:14:28.485002 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:28 crc kubenswrapper[4789]: I1216 08:14:28.517021 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d594f4d55-cjsm4" podStartSLOduration=5.626600025 podStartE2EDuration="7.517001708s" podCreationTimestamp="2025-12-16 08:14:21 +0000 UTC" firstStartedPulling="2025-12-16 08:14:22.344827338 +0000 UTC m=+5000.606714967" lastFinishedPulling="2025-12-16 08:14:24.235229001 +0000 UTC m=+5002.497116650" observedRunningTime="2025-12-16 08:14:25.320163302 +0000 UTC m=+5003.582050931" watchObservedRunningTime="2025-12-16 08:14:28.517001708 +0000 UTC m=+5006.778889337"
Dec 16 08:14:29 crc kubenswrapper[4789]: I1216 08:14:29.436428 4789 scope.go:117] "RemoveContainer" containerID="763907aa41f0a740f5fb126fe7a601bc456599e3461c9b622f13313a607321bf"
Dec 16 08:14:29 crc kubenswrapper[4789]: I1216 08:14:29.468227 4789 scope.go:117] "RemoveContainer" containerID="d18e14e0d92a6aaf0abf0a6ea5ccfbe8c1be6464e2b2ebabbea19eb58653e0dd"
Dec 16 08:14:29 crc kubenswrapper[4789]: I1216 08:14:29.506069 4789 scope.go:117] "RemoveContainer" containerID="4ff3fa5bc5ddd34edc97625376d9c1b6b912464bf54b886bdade75bcbac04294"
Dec 16 08:14:29 crc kubenswrapper[4789]: I1216 08:14:29.556895 4789 scope.go:117] "RemoveContainer" containerID="566b20d72aade5853918dcad07c3f1c50e2aa0d0d6e8337487a50cfde3f0a6db"
Dec 16 08:14:29 crc kubenswrapper[4789]: I1216 08:14:29.593217 4789 scope.go:117] "RemoveContainer" containerID="8923449c0a73f49cd36d602d2e8c9e491993983cbb5f00138067d76fe91236a7"
Dec 16 08:14:29 crc kubenswrapper[4789]: I1216 08:14:29.626225 4789 scope.go:117] "RemoveContainer" containerID="d2f44614e4ebb285256bfd2e3ee9e246d59dfcb93529e43f87132f1ab65c7cd0"
Dec 16 08:14:29 crc kubenswrapper[4789]: I1216 08:14:29.666307 4789 scope.go:117] "RemoveContainer" containerID="af3f62284f3a19d1e48fb5402a6202fda7a7120f9299520571784945a4590110"
Dec 16 08:14:30 crc kubenswrapper[4789]: I1216 08:14:30.134620 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55c9b758c6-tbswb"
Dec 16 08:14:31 crc kubenswrapper[4789]: I1216 08:14:31.895286 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64f998756c-btfjj"
Dec 16 08:14:31 crc kubenswrapper[4789]: I1216 08:14:31.954548 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648dfd6c8f-bdbc6"]
Dec 16 08:14:31 crc kubenswrapper[4789]: I1216 08:14:31.954818 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerName="dnsmasq-dns" containerID="cri-o://a4e80ba7f5a8006ed4e23610e0c2d8e874b589f56f62dd9713e976a0f6e9dfb2" gracePeriod=10
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.357134 4789 generic.go:334] "Generic (PLEG): container finished" podID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerID="a4e80ba7f5a8006ed4e23610e0c2d8e874b589f56f62dd9713e976a0f6e9dfb2" exitCode=0
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.357594 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" event={"ID":"9aa7ce20-5022-4daa-af09-1e3c111449c5","Type":"ContainerDied","Data":"a4e80ba7f5a8006ed4e23610e0c2d8e874b589f56f62dd9713e976a0f6e9dfb2"}
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.494092 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6"
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.667885 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-sb\") pod \"9aa7ce20-5022-4daa-af09-1e3c111449c5\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") "
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.667960 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-dns-svc\") pod \"9aa7ce20-5022-4daa-af09-1e3c111449c5\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") "
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.668018 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gfg9\" (UniqueName: \"kubernetes.io/projected/9aa7ce20-5022-4daa-af09-1e3c111449c5-kube-api-access-9gfg9\") pod \"9aa7ce20-5022-4daa-af09-1e3c111449c5\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") "
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.668040 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-config\") pod \"9aa7ce20-5022-4daa-af09-1e3c111449c5\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") "
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.668110 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-nb\") pod \"9aa7ce20-5022-4daa-af09-1e3c111449c5\" (UID: \"9aa7ce20-5022-4daa-af09-1e3c111449c5\") "
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.674132 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa7ce20-5022-4daa-af09-1e3c111449c5-kube-api-access-9gfg9" (OuterVolumeSpecName: "kube-api-access-9gfg9") pod "9aa7ce20-5022-4daa-af09-1e3c111449c5" (UID: "9aa7ce20-5022-4daa-af09-1e3c111449c5"). InnerVolumeSpecName "kube-api-access-9gfg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.721175 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aa7ce20-5022-4daa-af09-1e3c111449c5" (UID: "9aa7ce20-5022-4daa-af09-1e3c111449c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.736963 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-config" (OuterVolumeSpecName: "config") pod "9aa7ce20-5022-4daa-af09-1e3c111449c5" (UID: "9aa7ce20-5022-4daa-af09-1e3c111449c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.763234 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9aa7ce20-5022-4daa-af09-1e3c111449c5" (UID: "9aa7ce20-5022-4daa-af09-1e3c111449c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.769899 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.769992 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.770025 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gfg9\" (UniqueName: \"kubernetes.io/projected/9aa7ce20-5022-4daa-af09-1e3c111449c5-kube-api-access-9gfg9\") on node \"crc\" DevicePath \"\""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.770038 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-config\") on node \"crc\" DevicePath \"\""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.772626 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9aa7ce20-5022-4daa-af09-1e3c111449c5" (UID: "9aa7ce20-5022-4daa-af09-1e3c111449c5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:14:32 crc kubenswrapper[4789]: I1216 08:14:32.871130 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa7ce20-5022-4daa-af09-1e3c111449c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 16 08:14:33 crc kubenswrapper[4789]: I1216 08:14:33.367121 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" event={"ID":"9aa7ce20-5022-4daa-af09-1e3c111449c5","Type":"ContainerDied","Data":"4edd4d711a8f7f87de33a11e84533850b2a19d931debc48d6ca904f7e81e8c2e"}
Dec 16 08:14:33 crc kubenswrapper[4789]: I1216 08:14:33.367182 4789 scope.go:117] "RemoveContainer" containerID="a4e80ba7f5a8006ed4e23610e0c2d8e874b589f56f62dd9713e976a0f6e9dfb2"
Dec 16 08:14:33 crc kubenswrapper[4789]: I1216 08:14:33.367245 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6"
Dec 16 08:14:33 crc kubenswrapper[4789]: I1216 08:14:33.402825 4789 scope.go:117] "RemoveContainer" containerID="69df9522049cf9191373a4f1f282abcc668bf6b2f8559c3ba89e14bad7483b92"
Dec 16 08:14:33 crc kubenswrapper[4789]: I1216 08:14:33.435969 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648dfd6c8f-bdbc6"]
Dec 16 08:14:33 crc kubenswrapper[4789]: I1216 08:14:33.446752 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-648dfd6c8f-bdbc6"]
Dec 16 08:14:34 crc kubenswrapper[4789]: I1216 08:14:34.119699 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" path="/var/lib/kubelet/pods/9aa7ce20-5022-4daa-af09-1e3c111449c5/volumes"
Dec 16 08:14:37 crc kubenswrapper[4789]: I1216 08:14:37.260171 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-648dfd6c8f-bdbc6" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.19:5353: i/o timeout"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.125940 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7c8qf"]
Dec 16 08:14:42 crc kubenswrapper[4789]: E1216 08:14:42.126980 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerName="dnsmasq-dns"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.127001 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerName="dnsmasq-dns"
Dec 16 08:14:42 crc kubenswrapper[4789]: E1216 08:14:42.127020 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerName="init"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.127027 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerName="init"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.127204 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa7ce20-5022-4daa-af09-1e3c111449c5" containerName="dnsmasq-dns"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.127952 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.135151 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7c8qf"]
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.234973 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3b71-account-create-update-6shm9"]
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.235946 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3b71-account-create-update-6shm9"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.238188 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.242689 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3b71-account-create-update-6shm9"]
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.248828 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb9s2\" (UniqueName: \"kubernetes.io/projected/a2263c18-9034-41a3-a1c1-9833bda12fa3-kube-api-access-wb9s2\") pod \"neutron-db-create-7c8qf\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.248882 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2263c18-9034-41a3-a1c1-9833bda12fa3-operator-scripts\") pod \"neutron-db-create-7c8qf\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.350969 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e847b977-90db-4d0a-91eb-f38fa7cd9035-operator-scripts\") pod \"neutron-3b71-account-create-update-6shm9\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " pod="openstack/neutron-3b71-account-create-update-6shm9"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.351100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9s2\" (UniqueName: \"kubernetes.io/projected/a2263c18-9034-41a3-a1c1-9833bda12fa3-kube-api-access-wb9s2\") pod \"neutron-db-create-7c8qf\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.351135 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcght\" (UniqueName: \"kubernetes.io/projected/e847b977-90db-4d0a-91eb-f38fa7cd9035-kube-api-access-gcght\") pod \"neutron-3b71-account-create-update-6shm9\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " pod="openstack/neutron-3b71-account-create-update-6shm9"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.351161 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2263c18-9034-41a3-a1c1-9833bda12fa3-operator-scripts\") pod \"neutron-db-create-7c8qf\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.351760 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2263c18-9034-41a3-a1c1-9833bda12fa3-operator-scripts\") pod \"neutron-db-create-7c8qf\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.369971 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb9s2\" (UniqueName: \"kubernetes.io/projected/a2263c18-9034-41a3-a1c1-9833bda12fa3-kube-api-access-wb9s2\") pod \"neutron-db-create-7c8qf\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.451461 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7c8qf"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.453104 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcght\" (UniqueName: \"kubernetes.io/projected/e847b977-90db-4d0a-91eb-f38fa7cd9035-kube-api-access-gcght\") pod \"neutron-3b71-account-create-update-6shm9\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " pod="openstack/neutron-3b71-account-create-update-6shm9"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.453232 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e847b977-90db-4d0a-91eb-f38fa7cd9035-operator-scripts\") pod \"neutron-3b71-account-create-update-6shm9\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " pod="openstack/neutron-3b71-account-create-update-6shm9"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.454160 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e847b977-90db-4d0a-91eb-f38fa7cd9035-operator-scripts\") pod \"neutron-3b71-account-create-update-6shm9\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " pod="openstack/neutron-3b71-account-create-update-6shm9"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.470804 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcght\" (UniqueName: \"kubernetes.io/projected/e847b977-90db-4d0a-91eb-f38fa7cd9035-kube-api-access-gcght\") pod \"neutron-3b71-account-create-update-6shm9\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " pod="openstack/neutron-3b71-account-create-update-6shm9"
Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.554404 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3b71-account-create-update-6shm9" Dec 16 08:14:42 crc kubenswrapper[4789]: I1216 08:14:42.879714 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7c8qf"] Dec 16 08:14:42 crc kubenswrapper[4789]: W1216 08:14:42.884337 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2263c18_9034_41a3_a1c1_9833bda12fa3.slice/crio-7ff2c152d838a49d6972bf9bcf6c54238512d2ddef9c79066ee99935b17559d2 WatchSource:0}: Error finding container 7ff2c152d838a49d6972bf9bcf6c54238512d2ddef9c79066ee99935b17559d2: Status 404 returned error can't find the container with id 7ff2c152d838a49d6972bf9bcf6c54238512d2ddef9c79066ee99935b17559d2 Dec 16 08:14:43 crc kubenswrapper[4789]: I1216 08:14:43.077588 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3b71-account-create-update-6shm9"] Dec 16 08:14:43 crc kubenswrapper[4789]: W1216 08:14:43.080818 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode847b977_90db_4d0a_91eb_f38fa7cd9035.slice/crio-d1a3b198e12bc0218dec3770d2f9599a551892f1f1efa2b4c256986b0ad3320b WatchSource:0}: Error finding container d1a3b198e12bc0218dec3770d2f9599a551892f1f1efa2b4c256986b0ad3320b: Status 404 returned error can't find the container with id d1a3b198e12bc0218dec3770d2f9599a551892f1f1efa2b4c256986b0ad3320b Dec 16 08:14:43 crc kubenswrapper[4789]: I1216 08:14:43.442821 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3b71-account-create-update-6shm9" event={"ID":"e847b977-90db-4d0a-91eb-f38fa7cd9035","Type":"ContainerStarted","Data":"3d9e045cc89528945529db22f7098c0ff197a08d8af5ac7f5939c90ad9c9cb81"} Dec 16 08:14:43 crc kubenswrapper[4789]: I1216 08:14:43.443198 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-3b71-account-create-update-6shm9" event={"ID":"e847b977-90db-4d0a-91eb-f38fa7cd9035","Type":"ContainerStarted","Data":"d1a3b198e12bc0218dec3770d2f9599a551892f1f1efa2b4c256986b0ad3320b"} Dec 16 08:14:43 crc kubenswrapper[4789]: I1216 08:14:43.444553 4789 generic.go:334] "Generic (PLEG): container finished" podID="a2263c18-9034-41a3-a1c1-9833bda12fa3" containerID="3a8413362262125d355800eb63ed82a1221e13ea56e7e186242c98278b80f337" exitCode=0 Dec 16 08:14:43 crc kubenswrapper[4789]: I1216 08:14:43.444585 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7c8qf" event={"ID":"a2263c18-9034-41a3-a1c1-9833bda12fa3","Type":"ContainerDied","Data":"3a8413362262125d355800eb63ed82a1221e13ea56e7e186242c98278b80f337"} Dec 16 08:14:43 crc kubenswrapper[4789]: I1216 08:14:43.444603 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7c8qf" event={"ID":"a2263c18-9034-41a3-a1c1-9833bda12fa3","Type":"ContainerStarted","Data":"7ff2c152d838a49d6972bf9bcf6c54238512d2ddef9c79066ee99935b17559d2"} Dec 16 08:14:43 crc kubenswrapper[4789]: I1216 08:14:43.460461 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-3b71-account-create-update-6shm9" podStartSLOduration=1.460444597 podStartE2EDuration="1.460444597s" podCreationTimestamp="2025-12-16 08:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:14:43.456143602 +0000 UTC m=+5021.718031241" watchObservedRunningTime="2025-12-16 08:14:43.460444597 +0000 UTC m=+5021.722332226" Dec 16 08:14:44 crc kubenswrapper[4789]: I1216 08:14:44.453395 4789 generic.go:334] "Generic (PLEG): container finished" podID="e847b977-90db-4d0a-91eb-f38fa7cd9035" containerID="3d9e045cc89528945529db22f7098c0ff197a08d8af5ac7f5939c90ad9c9cb81" exitCode=0 Dec 16 08:14:44 crc kubenswrapper[4789]: I1216 08:14:44.453444 4789 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3b71-account-create-update-6shm9" event={"ID":"e847b977-90db-4d0a-91eb-f38fa7cd9035","Type":"ContainerDied","Data":"3d9e045cc89528945529db22f7098c0ff197a08d8af5ac7f5939c90ad9c9cb81"} Dec 16 08:14:44 crc kubenswrapper[4789]: I1216 08:14:44.800009 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7c8qf" Dec 16 08:14:44 crc kubenswrapper[4789]: I1216 08:14:44.898388 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2263c18-9034-41a3-a1c1-9833bda12fa3-operator-scripts\") pod \"a2263c18-9034-41a3-a1c1-9833bda12fa3\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " Dec 16 08:14:44 crc kubenswrapper[4789]: I1216 08:14:44.898489 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb9s2\" (UniqueName: \"kubernetes.io/projected/a2263c18-9034-41a3-a1c1-9833bda12fa3-kube-api-access-wb9s2\") pod \"a2263c18-9034-41a3-a1c1-9833bda12fa3\" (UID: \"a2263c18-9034-41a3-a1c1-9833bda12fa3\") " Dec 16 08:14:44 crc kubenswrapper[4789]: I1216 08:14:44.899068 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2263c18-9034-41a3-a1c1-9833bda12fa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2263c18-9034-41a3-a1c1-9833bda12fa3" (UID: "a2263c18-9034-41a3-a1c1-9833bda12fa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:14:44 crc kubenswrapper[4789]: I1216 08:14:44.903638 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2263c18-9034-41a3-a1c1-9833bda12fa3-kube-api-access-wb9s2" (OuterVolumeSpecName: "kube-api-access-wb9s2") pod "a2263c18-9034-41a3-a1c1-9833bda12fa3" (UID: "a2263c18-9034-41a3-a1c1-9833bda12fa3"). 
InnerVolumeSpecName "kube-api-access-wb9s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:14:45 crc kubenswrapper[4789]: I1216 08:14:45.000641 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb9s2\" (UniqueName: \"kubernetes.io/projected/a2263c18-9034-41a3-a1c1-9833bda12fa3-kube-api-access-wb9s2\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:45 crc kubenswrapper[4789]: I1216 08:14:45.000679 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2263c18-9034-41a3-a1c1-9833bda12fa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:45 crc kubenswrapper[4789]: I1216 08:14:45.462226 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7c8qf" Dec 16 08:14:45 crc kubenswrapper[4789]: I1216 08:14:45.462255 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7c8qf" event={"ID":"a2263c18-9034-41a3-a1c1-9833bda12fa3","Type":"ContainerDied","Data":"7ff2c152d838a49d6972bf9bcf6c54238512d2ddef9c79066ee99935b17559d2"} Dec 16 08:14:45 crc kubenswrapper[4789]: I1216 08:14:45.462285 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ff2c152d838a49d6972bf9bcf6c54238512d2ddef9c79066ee99935b17559d2" Dec 16 08:14:45 crc kubenswrapper[4789]: I1216 08:14:45.923059 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3b71-account-create-update-6shm9" Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.019158 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e847b977-90db-4d0a-91eb-f38fa7cd9035-operator-scripts\") pod \"e847b977-90db-4d0a-91eb-f38fa7cd9035\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.019796 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcght\" (UniqueName: \"kubernetes.io/projected/e847b977-90db-4d0a-91eb-f38fa7cd9035-kube-api-access-gcght\") pod \"e847b977-90db-4d0a-91eb-f38fa7cd9035\" (UID: \"e847b977-90db-4d0a-91eb-f38fa7cd9035\") " Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.020239 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e847b977-90db-4d0a-91eb-f38fa7cd9035-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e847b977-90db-4d0a-91eb-f38fa7cd9035" (UID: "e847b977-90db-4d0a-91eb-f38fa7cd9035"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.020625 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e847b977-90db-4d0a-91eb-f38fa7cd9035-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.026259 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e847b977-90db-4d0a-91eb-f38fa7cd9035-kube-api-access-gcght" (OuterVolumeSpecName: "kube-api-access-gcght") pod "e847b977-90db-4d0a-91eb-f38fa7cd9035" (UID: "e847b977-90db-4d0a-91eb-f38fa7cd9035"). InnerVolumeSpecName "kube-api-access-gcght". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.122120 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcght\" (UniqueName: \"kubernetes.io/projected/e847b977-90db-4d0a-91eb-f38fa7cd9035-kube-api-access-gcght\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.474392 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3b71-account-create-update-6shm9" event={"ID":"e847b977-90db-4d0a-91eb-f38fa7cd9035","Type":"ContainerDied","Data":"d1a3b198e12bc0218dec3770d2f9599a551892f1f1efa2b4c256986b0ad3320b"} Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.474441 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3b71-account-create-update-6shm9" Dec 16 08:14:46 crc kubenswrapper[4789]: I1216 08:14:46.474446 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a3b198e12bc0218dec3770d2f9599a551892f1f1efa2b4c256986b0ad3320b" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.512018 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zfv7d"] Dec 16 08:14:47 crc kubenswrapper[4789]: E1216 08:14:47.512884 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e847b977-90db-4d0a-91eb-f38fa7cd9035" containerName="mariadb-account-create-update" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.512905 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e847b977-90db-4d0a-91eb-f38fa7cd9035" containerName="mariadb-account-create-update" Dec 16 08:14:47 crc kubenswrapper[4789]: E1216 08:14:47.513150 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2263c18-9034-41a3-a1c1-9833bda12fa3" containerName="mariadb-database-create" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.513184 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a2263c18-9034-41a3-a1c1-9833bda12fa3" containerName="mariadb-database-create" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.513462 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e847b977-90db-4d0a-91eb-f38fa7cd9035" containerName="mariadb-account-create-update" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.513504 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2263c18-9034-41a3-a1c1-9833bda12fa3" containerName="mariadb-database-create" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.514336 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.518781 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.519049 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.519295 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zcsqz" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.521621 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zfv7d"] Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.647409 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lhrh\" (UniqueName: \"kubernetes.io/projected/cca73e6e-b52b-44af-821f-96e2e7be8bf3-kube-api-access-2lhrh\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.647550 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-config\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.647588 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-combined-ca-bundle\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.749171 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lhrh\" (UniqueName: \"kubernetes.io/projected/cca73e6e-b52b-44af-821f-96e2e7be8bf3-kube-api-access-2lhrh\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.749295 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-config\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.749325 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-combined-ca-bundle\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.754636 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-config\") pod \"neutron-db-sync-zfv7d\" (UID: 
\"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.756226 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-combined-ca-bundle\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.765777 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lhrh\" (UniqueName: \"kubernetes.io/projected/cca73e6e-b52b-44af-821f-96e2e7be8bf3-kube-api-access-2lhrh\") pod \"neutron-db-sync-zfv7d\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:47 crc kubenswrapper[4789]: I1216 08:14:47.834412 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:48 crc kubenswrapper[4789]: I1216 08:14:48.258409 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zfv7d"] Dec 16 08:14:48 crc kubenswrapper[4789]: I1216 08:14:48.491319 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zfv7d" event={"ID":"cca73e6e-b52b-44af-821f-96e2e7be8bf3","Type":"ContainerStarted","Data":"c40445bbd449bb5bf768a739e15d6f32b2ba40352aca9ad4001d710bb16d86b8"} Dec 16 08:14:48 crc kubenswrapper[4789]: I1216 08:14:48.491635 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zfv7d" event={"ID":"cca73e6e-b52b-44af-821f-96e2e7be8bf3","Type":"ContainerStarted","Data":"018b2b3b9afe36c70a20ca033f4c4c863847595a309c4483162875242cd9d33f"} Dec 16 08:14:48 crc kubenswrapper[4789]: I1216 08:14:48.510289 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zfv7d" podStartSLOduration=1.510273501 
podStartE2EDuration="1.510273501s" podCreationTimestamp="2025-12-16 08:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:14:48.504857809 +0000 UTC m=+5026.766745448" watchObservedRunningTime="2025-12-16 08:14:48.510273501 +0000 UTC m=+5026.772161140" Dec 16 08:14:51 crc kubenswrapper[4789]: I1216 08:14:51.928078 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:14:51 crc kubenswrapper[4789]: I1216 08:14:51.928441 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:14:53 crc kubenswrapper[4789]: I1216 08:14:53.530446 4789 generic.go:334] "Generic (PLEG): container finished" podID="cca73e6e-b52b-44af-821f-96e2e7be8bf3" containerID="c40445bbd449bb5bf768a739e15d6f32b2ba40352aca9ad4001d710bb16d86b8" exitCode=0 Dec 16 08:14:53 crc kubenswrapper[4789]: I1216 08:14:53.530582 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zfv7d" event={"ID":"cca73e6e-b52b-44af-821f-96e2e7be8bf3","Type":"ContainerDied","Data":"c40445bbd449bb5bf768a739e15d6f32b2ba40352aca9ad4001d710bb16d86b8"} Dec 16 08:14:54 crc kubenswrapper[4789]: I1216 08:14:54.978036 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.087934 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lhrh\" (UniqueName: \"kubernetes.io/projected/cca73e6e-b52b-44af-821f-96e2e7be8bf3-kube-api-access-2lhrh\") pod \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.088018 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-config\") pod \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.088136 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-combined-ca-bundle\") pod \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\" (UID: \"cca73e6e-b52b-44af-821f-96e2e7be8bf3\") " Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.100463 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca73e6e-b52b-44af-821f-96e2e7be8bf3-kube-api-access-2lhrh" (OuterVolumeSpecName: "kube-api-access-2lhrh") pod "cca73e6e-b52b-44af-821f-96e2e7be8bf3" (UID: "cca73e6e-b52b-44af-821f-96e2e7be8bf3"). InnerVolumeSpecName "kube-api-access-2lhrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.118515 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-config" (OuterVolumeSpecName: "config") pod "cca73e6e-b52b-44af-821f-96e2e7be8bf3" (UID: "cca73e6e-b52b-44af-821f-96e2e7be8bf3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.125656 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cca73e6e-b52b-44af-821f-96e2e7be8bf3" (UID: "cca73e6e-b52b-44af-821f-96e2e7be8bf3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.190632 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lhrh\" (UniqueName: \"kubernetes.io/projected/cca73e6e-b52b-44af-821f-96e2e7be8bf3-kube-api-access-2lhrh\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.192062 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.192098 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca73e6e-b52b-44af-821f-96e2e7be8bf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.609195 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zfv7d" event={"ID":"cca73e6e-b52b-44af-821f-96e2e7be8bf3","Type":"ContainerDied","Data":"018b2b3b9afe36c70a20ca033f4c4c863847595a309c4483162875242cd9d33f"} Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.609252 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="018b2b3b9afe36c70a20ca033f4c4c863847595a309c4483162875242cd9d33f" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.609329 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zfv7d" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.956349 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58d7b6d9-vfrgx"] Dec 16 08:14:55 crc kubenswrapper[4789]: E1216 08:14:55.956742 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca73e6e-b52b-44af-821f-96e2e7be8bf3" containerName="neutron-db-sync" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.956767 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca73e6e-b52b-44af-821f-96e2e7be8bf3" containerName="neutron-db-sync" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.956992 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca73e6e-b52b-44af-821f-96e2e7be8bf3" containerName="neutron-db-sync" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.958011 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:55 crc kubenswrapper[4789]: I1216 08:14:55.963805 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58d7b6d9-vfrgx"] Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.024221 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d9f7fc5b5-p8zdl"] Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.025970 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.028810 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.029086 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.029325 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zcsqz" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.050756 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d9f7fc5b5-p8zdl"] Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.102672 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-config\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.102754 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-nb\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.102797 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-sb\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.102843 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56vk5\" (UniqueName: \"kubernetes.io/projected/272bbed9-b63b-4ecc-b85f-434030f27a80-kube-api-access-56vk5\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.102872 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-dns-svc\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.204841 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-sb\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.204926 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56vk5\" (UniqueName: \"kubernetes.io/projected/272bbed9-b63b-4ecc-b85f-434030f27a80-kube-api-access-56vk5\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.204958 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-config\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.204981 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-dns-svc\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.205013 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q5ht\" (UniqueName: \"kubernetes.io/projected/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-kube-api-access-7q5ht\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.205078 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-combined-ca-bundle\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.205100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-config\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.205132 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-httpd-config\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.205154 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-nb\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.206832 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-sb\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.207847 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-dns-svc\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.208530 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-config\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.209226 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-nb\") pod \"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.226672 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56vk5\" (UniqueName: \"kubernetes.io/projected/272bbed9-b63b-4ecc-b85f-434030f27a80-kube-api-access-56vk5\") pod 
\"dnsmasq-dns-58d7b6d9-vfrgx\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.281580 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.306835 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-combined-ca-bundle\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.306899 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-httpd-config\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.307695 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-config\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.307741 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5ht\" (UniqueName: \"kubernetes.io/projected/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-kube-api-access-7q5ht\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.311739 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-config\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.314259 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-combined-ca-bundle\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.315682 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-httpd-config\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.334640 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q5ht\" (UniqueName: \"kubernetes.io/projected/7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07-kube-api-access-7q5ht\") pod \"neutron-5d9f7fc5b5-p8zdl\" (UID: \"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07\") " pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.356422 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:56 crc kubenswrapper[4789]: I1216 08:14:56.850531 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58d7b6d9-vfrgx"] Dec 16 08:14:56 crc kubenswrapper[4789]: W1216 08:14:56.879633 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272bbed9_b63b_4ecc_b85f_434030f27a80.slice/crio-241d44117bec03976c0371a1ceefe430a4819ba4a9837c42bee00834f85659a5 WatchSource:0}: Error finding container 241d44117bec03976c0371a1ceefe430a4819ba4a9837c42bee00834f85659a5: Status 404 returned error can't find the container with id 241d44117bec03976c0371a1ceefe430a4819ba4a9837c42bee00834f85659a5 Dec 16 08:14:57 crc kubenswrapper[4789]: I1216 08:14:57.043052 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d9f7fc5b5-p8zdl"] Dec 16 08:14:57 crc kubenswrapper[4789]: W1216 08:14:57.053273 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bf8b381_0e40_4ec0_9cd2_9b5752ec9e07.slice/crio-c83adeb5fc1065c98b5408b8028a842e333d38dd1d439f0c3f52556352f3fe1f WatchSource:0}: Error finding container c83adeb5fc1065c98b5408b8028a842e333d38dd1d439f0c3f52556352f3fe1f: Status 404 returned error can't find the container with id c83adeb5fc1065c98b5408b8028a842e333d38dd1d439f0c3f52556352f3fe1f Dec 16 08:14:57 crc kubenswrapper[4789]: I1216 08:14:57.646220 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9f7fc5b5-p8zdl" event={"ID":"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07","Type":"ContainerStarted","Data":"c83adeb5fc1065c98b5408b8028a842e333d38dd1d439f0c3f52556352f3fe1f"} Dec 16 08:14:57 crc kubenswrapper[4789]: I1216 08:14:57.648237 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" 
event={"ID":"272bbed9-b63b-4ecc-b85f-434030f27a80","Type":"ContainerStarted","Data":"241d44117bec03976c0371a1ceefe430a4819ba4a9837c42bee00834f85659a5"} Dec 16 08:14:58 crc kubenswrapper[4789]: I1216 08:14:58.656174 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9f7fc5b5-p8zdl" event={"ID":"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07","Type":"ContainerStarted","Data":"4e7cada62eb1d322fb1f0ed0d512034d1deaac5807fc4b92ffd67f2fe28c2a9e"} Dec 16 08:14:58 crc kubenswrapper[4789]: I1216 08:14:58.656712 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:14:58 crc kubenswrapper[4789]: I1216 08:14:58.656733 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9f7fc5b5-p8zdl" event={"ID":"7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07","Type":"ContainerStarted","Data":"1d01efb022c0d84e7ff4cb27613186e5b01ee2e8b25f8bd06ea9969febd59764"} Dec 16 08:14:58 crc kubenswrapper[4789]: I1216 08:14:58.657488 4789 generic.go:334] "Generic (PLEG): container finished" podID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerID="8cfb5362fa2c50f67ba70bde43c016e5b44e4f04b0df6b374ce243a69a2214b9" exitCode=0 Dec 16 08:14:58 crc kubenswrapper[4789]: I1216 08:14:58.657521 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" event={"ID":"272bbed9-b63b-4ecc-b85f-434030f27a80","Type":"ContainerDied","Data":"8cfb5362fa2c50f67ba70bde43c016e5b44e4f04b0df6b374ce243a69a2214b9"} Dec 16 08:14:58 crc kubenswrapper[4789]: I1216 08:14:58.684072 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d9f7fc5b5-p8zdl" podStartSLOduration=3.684049563 podStartE2EDuration="3.684049563s" podCreationTimestamp="2025-12-16 08:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:14:58.674496029 +0000 UTC m=+5036.936383668" 
watchObservedRunningTime="2025-12-16 08:14:58.684049563 +0000 UTC m=+5036.945937202" Dec 16 08:14:59 crc kubenswrapper[4789]: I1216 08:14:59.665885 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" event={"ID":"272bbed9-b63b-4ecc-b85f-434030f27a80","Type":"ContainerStarted","Data":"64ddee54f571d1f9d6e1058c3b9ce8d332686923054b3faed4f534e6a8f5d59c"} Dec 16 08:14:59 crc kubenswrapper[4789]: I1216 08:14:59.684401 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" podStartSLOduration=4.684378176 podStartE2EDuration="4.684378176s" podCreationTimestamp="2025-12-16 08:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:14:59.681667369 +0000 UTC m=+5037.943554998" watchObservedRunningTime="2025-12-16 08:14:59.684378176 +0000 UTC m=+5037.946265805" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.151732 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl"] Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.152977 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl"] Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.153073 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.155351 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.167508 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.293041 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d87b09-4747-4165-b443-e19c0dfbbec8-config-volume\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.293382 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d87b09-4747-4165-b443-e19c0dfbbec8-secret-volume\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.293433 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkf4r\" (UniqueName: \"kubernetes.io/projected/00d87b09-4747-4165-b443-e19c0dfbbec8-kube-api-access-nkf4r\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.395652 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/00d87b09-4747-4165-b443-e19c0dfbbec8-secret-volume\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.395725 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkf4r\" (UniqueName: \"kubernetes.io/projected/00d87b09-4747-4165-b443-e19c0dfbbec8-kube-api-access-nkf4r\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.395815 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d87b09-4747-4165-b443-e19c0dfbbec8-config-volume\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.402758 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d87b09-4747-4165-b443-e19c0dfbbec8-secret-volume\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.412653 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d87b09-4747-4165-b443-e19c0dfbbec8-config-volume\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.418932 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkf4r\" (UniqueName: \"kubernetes.io/projected/00d87b09-4747-4165-b443-e19c0dfbbec8-kube-api-access-nkf4r\") pod \"collect-profiles-29431215-hlbrl\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.482520 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.673178 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:15:00 crc kubenswrapper[4789]: I1216 08:15:00.917263 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl"] Dec 16 08:15:00 crc kubenswrapper[4789]: W1216 08:15:00.919502 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d87b09_4747_4165_b443_e19c0dfbbec8.slice/crio-3c2b381069b318fd1a4ab81aba718b692ba337b97a72fa33a6ab91e3152b23d3 WatchSource:0}: Error finding container 3c2b381069b318fd1a4ab81aba718b692ba337b97a72fa33a6ab91e3152b23d3: Status 404 returned error can't find the container with id 3c2b381069b318fd1a4ab81aba718b692ba337b97a72fa33a6ab91e3152b23d3 Dec 16 08:15:01 crc kubenswrapper[4789]: I1216 08:15:01.681075 4789 generic.go:334] "Generic (PLEG): container finished" podID="00d87b09-4747-4165-b443-e19c0dfbbec8" containerID="4f25896a9c7656d4842b0c5701e84dca548fe0d82e5d6e47a792b8c468f56803" exitCode=0 Dec 16 08:15:01 crc kubenswrapper[4789]: I1216 08:15:01.681297 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" 
event={"ID":"00d87b09-4747-4165-b443-e19c0dfbbec8","Type":"ContainerDied","Data":"4f25896a9c7656d4842b0c5701e84dca548fe0d82e5d6e47a792b8c468f56803"} Dec 16 08:15:01 crc kubenswrapper[4789]: I1216 08:15:01.682026 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" event={"ID":"00d87b09-4747-4165-b443-e19c0dfbbec8","Type":"ContainerStarted","Data":"3c2b381069b318fd1a4ab81aba718b692ba337b97a72fa33a6ab91e3152b23d3"} Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.042973 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.242586 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d87b09-4747-4165-b443-e19c0dfbbec8-config-volume\") pod \"00d87b09-4747-4165-b443-e19c0dfbbec8\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.242760 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d87b09-4747-4165-b443-e19c0dfbbec8-secret-volume\") pod \"00d87b09-4747-4165-b443-e19c0dfbbec8\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.242846 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkf4r\" (UniqueName: \"kubernetes.io/projected/00d87b09-4747-4165-b443-e19c0dfbbec8-kube-api-access-nkf4r\") pod \"00d87b09-4747-4165-b443-e19c0dfbbec8\" (UID: \"00d87b09-4747-4165-b443-e19c0dfbbec8\") " Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.246200 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d87b09-4747-4165-b443-e19c0dfbbec8-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "00d87b09-4747-4165-b443-e19c0dfbbec8" (UID: "00d87b09-4747-4165-b443-e19c0dfbbec8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.257215 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d87b09-4747-4165-b443-e19c0dfbbec8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "00d87b09-4747-4165-b443-e19c0dfbbec8" (UID: "00d87b09-4747-4165-b443-e19c0dfbbec8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.267248 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d87b09-4747-4165-b443-e19c0dfbbec8-kube-api-access-nkf4r" (OuterVolumeSpecName: "kube-api-access-nkf4r") pod "00d87b09-4747-4165-b443-e19c0dfbbec8" (UID: "00d87b09-4747-4165-b443-e19c0dfbbec8"). InnerVolumeSpecName "kube-api-access-nkf4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.345522 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00d87b09-4747-4165-b443-e19c0dfbbec8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.345881 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkf4r\" (UniqueName: \"kubernetes.io/projected/00d87b09-4747-4165-b443-e19c0dfbbec8-kube-api-access-nkf4r\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.345956 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00d87b09-4747-4165-b443-e19c0dfbbec8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.701571 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" event={"ID":"00d87b09-4747-4165-b443-e19c0dfbbec8","Type":"ContainerDied","Data":"3c2b381069b318fd1a4ab81aba718b692ba337b97a72fa33a6ab91e3152b23d3"} Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.701622 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl" Dec 16 08:15:03 crc kubenswrapper[4789]: I1216 08:15:03.701625 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c2b381069b318fd1a4ab81aba718b692ba337b97a72fa33a6ab91e3152b23d3" Dec 16 08:15:04 crc kubenswrapper[4789]: I1216 08:15:04.117106 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"] Dec 16 08:15:04 crc kubenswrapper[4789]: I1216 08:15:04.123620 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431170-kshmp"] Dec 16 08:15:06 crc kubenswrapper[4789]: I1216 08:15:06.116218 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1079d4-ef8d-4d61-a1da-1616004f8c66" path="/var/lib/kubelet/pods/3e1079d4-ef8d-4d61-a1da-1616004f8c66/volumes" Dec 16 08:15:06 crc kubenswrapper[4789]: I1216 08:15:06.283087 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:15:06 crc kubenswrapper[4789]: I1216 08:15:06.354124 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f998756c-btfjj"] Dec 16 08:15:06 crc kubenswrapper[4789]: I1216 08:15:06.354357 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64f998756c-btfjj" podUID="185bafa0-88c8-4439-b600-e6c906face05" containerName="dnsmasq-dns" containerID="cri-o://eef18103d1408ab494bf22771526a9bffe1cf486e7539782cd4982d78e492bbe" gracePeriod=10 Dec 16 08:15:06 crc kubenswrapper[4789]: I1216 08:15:06.893929 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-64f998756c-btfjj" podUID="185bafa0-88c8-4439-b600-e6c906face05" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.30:5353: connect: connection refused" Dec 16 08:15:07 
crc kubenswrapper[4789]: I1216 08:15:07.733519 4789 generic.go:334] "Generic (PLEG): container finished" podID="185bafa0-88c8-4439-b600-e6c906face05" containerID="eef18103d1408ab494bf22771526a9bffe1cf486e7539782cd4982d78e492bbe" exitCode=0 Dec 16 08:15:07 crc kubenswrapper[4789]: I1216 08:15:07.733571 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f998756c-btfjj" event={"ID":"185bafa0-88c8-4439-b600-e6c906face05","Type":"ContainerDied","Data":"eef18103d1408ab494bf22771526a9bffe1cf486e7539782cd4982d78e492bbe"} Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.123820 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.234561 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-dns-svc\") pod \"185bafa0-88c8-4439-b600-e6c906face05\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.234647 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmldf\" (UniqueName: \"kubernetes.io/projected/185bafa0-88c8-4439-b600-e6c906face05-kube-api-access-bmldf\") pod \"185bafa0-88c8-4439-b600-e6c906face05\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.234765 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-sb\") pod \"185bafa0-88c8-4439-b600-e6c906face05\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.234808 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-config\") pod \"185bafa0-88c8-4439-b600-e6c906face05\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.234956 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-nb\") pod \"185bafa0-88c8-4439-b600-e6c906face05\" (UID: \"185bafa0-88c8-4439-b600-e6c906face05\") " Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.245316 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185bafa0-88c8-4439-b600-e6c906face05-kube-api-access-bmldf" (OuterVolumeSpecName: "kube-api-access-bmldf") pod "185bafa0-88c8-4439-b600-e6c906face05" (UID: "185bafa0-88c8-4439-b600-e6c906face05"). InnerVolumeSpecName "kube-api-access-bmldf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.272201 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "185bafa0-88c8-4439-b600-e6c906face05" (UID: "185bafa0-88c8-4439-b600-e6c906face05"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.272216 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "185bafa0-88c8-4439-b600-e6c906face05" (UID: "185bafa0-88c8-4439-b600-e6c906face05"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.275561 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-config" (OuterVolumeSpecName: "config") pod "185bafa0-88c8-4439-b600-e6c906face05" (UID: "185bafa0-88c8-4439-b600-e6c906face05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.281797 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "185bafa0-88c8-4439-b600-e6c906face05" (UID: "185bafa0-88c8-4439-b600-e6c906face05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.336380 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.336415 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.336426 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.336434 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/185bafa0-88c8-4439-b600-e6c906face05-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:08 crc kubenswrapper[4789]: 
I1216 08:15:08.336444 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmldf\" (UniqueName: \"kubernetes.io/projected/185bafa0-88c8-4439-b600-e6c906face05-kube-api-access-bmldf\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.742552 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f998756c-btfjj" event={"ID":"185bafa0-88c8-4439-b600-e6c906face05","Type":"ContainerDied","Data":"4f4b82069c9f3270cd68c25a7efbf13b307091ea86dd6c1a954b4b563276f38c"} Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.742613 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f998756c-btfjj" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.742624 4789 scope.go:117] "RemoveContainer" containerID="eef18103d1408ab494bf22771526a9bffe1cf486e7539782cd4982d78e492bbe" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.764158 4789 scope.go:117] "RemoveContainer" containerID="108a401501389686f042ae7c47c8838cf6de7813a6a1b5e1b7100187068ae400" Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.780930 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f998756c-btfjj"] Dec 16 08:15:08 crc kubenswrapper[4789]: I1216 08:15:08.789174 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64f998756c-btfjj"] Dec 16 08:15:10 crc kubenswrapper[4789]: I1216 08:15:10.114307 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185bafa0-88c8-4439-b600-e6c906face05" path="/var/lib/kubelet/pods/185bafa0-88c8-4439-b600-e6c906face05/volumes" Dec 16 08:15:21 crc kubenswrapper[4789]: I1216 08:15:21.927962 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 16 08:15:21 crc kubenswrapper[4789]: I1216 08:15:21.928505 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:15:26 crc kubenswrapper[4789]: I1216 08:15:26.364050 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d9f7fc5b5-p8zdl" Dec 16 08:15:29 crc kubenswrapper[4789]: I1216 08:15:29.815866 4789 scope.go:117] "RemoveContainer" containerID="ed517f2381834c54c99a7e402bac8b0d9fb902b884854bb93612702a90eb4b4c" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.230004 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lndx6"] Dec 16 08:15:33 crc kubenswrapper[4789]: E1216 08:15:33.230971 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185bafa0-88c8-4439-b600-e6c906face05" containerName="init" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.230986 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="185bafa0-88c8-4439-b600-e6c906face05" containerName="init" Dec 16 08:15:33 crc kubenswrapper[4789]: E1216 08:15:33.231009 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185bafa0-88c8-4439-b600-e6c906face05" containerName="dnsmasq-dns" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.231016 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="185bafa0-88c8-4439-b600-e6c906face05" containerName="dnsmasq-dns" Dec 16 08:15:33 crc kubenswrapper[4789]: E1216 08:15:33.231030 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d87b09-4747-4165-b443-e19c0dfbbec8" containerName="collect-profiles" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.231039 4789 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="00d87b09-4747-4165-b443-e19c0dfbbec8" containerName="collect-profiles" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.231224 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="185bafa0-88c8-4439-b600-e6c906face05" containerName="dnsmasq-dns" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.231251 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d87b09-4747-4165-b443-e19c0dfbbec8" containerName="collect-profiles" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.231939 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.240541 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lndx6"] Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.297940 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc820800-6a99-446e-b203-065a764d80a6-operator-scripts\") pod \"glance-db-create-lndx6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.298013 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmvhc\" (UniqueName: \"kubernetes.io/projected/dc820800-6a99-446e-b203-065a764d80a6-kube-api-access-bmvhc\") pod \"glance-db-create-lndx6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.399974 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-837b-account-create-update-tb5cw"] Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.401674 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.409965 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.421691 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc820800-6a99-446e-b203-065a764d80a6-operator-scripts\") pod \"glance-db-create-lndx6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.421771 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmvhc\" (UniqueName: \"kubernetes.io/projected/dc820800-6a99-446e-b203-065a764d80a6-kube-api-access-bmvhc\") pod \"glance-db-create-lndx6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.422872 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc820800-6a99-446e-b203-065a764d80a6-operator-scripts\") pod \"glance-db-create-lndx6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.445651 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-837b-account-create-update-tb5cw"] Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.462668 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmvhc\" (UniqueName: \"kubernetes.io/projected/dc820800-6a99-446e-b203-065a764d80a6-kube-api-access-bmvhc\") pod \"glance-db-create-lndx6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 
08:15:33.523319 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/606167d1-6cc1-4177-ae73-8369b4bddcf4-operator-scripts\") pod \"glance-837b-account-create-update-tb5cw\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.523594 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pd2z\" (UniqueName: \"kubernetes.io/projected/606167d1-6cc1-4177-ae73-8369b4bddcf4-kube-api-access-5pd2z\") pod \"glance-837b-account-create-update-tb5cw\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.611533 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lndx6" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.627827 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pd2z\" (UniqueName: \"kubernetes.io/projected/606167d1-6cc1-4177-ae73-8369b4bddcf4-kube-api-access-5pd2z\") pod \"glance-837b-account-create-update-tb5cw\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.628051 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/606167d1-6cc1-4177-ae73-8369b4bddcf4-operator-scripts\") pod \"glance-837b-account-create-update-tb5cw\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.628772 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/606167d1-6cc1-4177-ae73-8369b4bddcf4-operator-scripts\") pod \"glance-837b-account-create-update-tb5cw\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.644503 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pd2z\" (UniqueName: \"kubernetes.io/projected/606167d1-6cc1-4177-ae73-8369b4bddcf4-kube-api-access-5pd2z\") pod \"glance-837b-account-create-update-tb5cw\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:33 crc kubenswrapper[4789]: I1216 08:15:33.725743 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.033869 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lndx6"] Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.176618 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-837b-account-create-update-tb5cw"] Dec 16 08:15:34 crc kubenswrapper[4789]: W1216 08:15:34.178799 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod606167d1_6cc1_4177_ae73_8369b4bddcf4.slice/crio-cdd89cadf3ab83af8af7004da52b0f934b7aee815655e543ec8055ee9d08bd45 WatchSource:0}: Error finding container cdd89cadf3ab83af8af7004da52b0f934b7aee815655e543ec8055ee9d08bd45: Status 404 returned error can't find the container with id cdd89cadf3ab83af8af7004da52b0f934b7aee815655e543ec8055ee9d08bd45 Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.958762 4789 generic.go:334] "Generic (PLEG): container finished" podID="dc820800-6a99-446e-b203-065a764d80a6" 
containerID="f3c6fe2b94bac5470c865964e9caed105820fecaf565aa48d53ed7b52ec234fb" exitCode=0 Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.958962 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lndx6" event={"ID":"dc820800-6a99-446e-b203-065a764d80a6","Type":"ContainerDied","Data":"f3c6fe2b94bac5470c865964e9caed105820fecaf565aa48d53ed7b52ec234fb"} Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.959160 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lndx6" event={"ID":"dc820800-6a99-446e-b203-065a764d80a6","Type":"ContainerStarted","Data":"35e268f896096ece4f6e1200a907d3cc62dc8294e0a6db77a0dc1810f6cc7d74"} Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.961086 4789 generic.go:334] "Generic (PLEG): container finished" podID="606167d1-6cc1-4177-ae73-8369b4bddcf4" containerID="2357d778a830c0b83497163681744e0d8eb8154927b1afdeee85558169b24107" exitCode=0 Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.961132 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-837b-account-create-update-tb5cw" event={"ID":"606167d1-6cc1-4177-ae73-8369b4bddcf4","Type":"ContainerDied","Data":"2357d778a830c0b83497163681744e0d8eb8154927b1afdeee85558169b24107"} Dec 16 08:15:34 crc kubenswrapper[4789]: I1216 08:15:34.961162 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-837b-account-create-update-tb5cw" event={"ID":"606167d1-6cc1-4177-ae73-8369b4bddcf4","Type":"ContainerStarted","Data":"cdd89cadf3ab83af8af7004da52b0f934b7aee815655e543ec8055ee9d08bd45"} Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.247386 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lndx6" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.327733 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.369489 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmvhc\" (UniqueName: \"kubernetes.io/projected/dc820800-6a99-446e-b203-065a764d80a6-kube-api-access-bmvhc\") pod \"dc820800-6a99-446e-b203-065a764d80a6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.369570 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc820800-6a99-446e-b203-065a764d80a6-operator-scripts\") pod \"dc820800-6a99-446e-b203-065a764d80a6\" (UID: \"dc820800-6a99-446e-b203-065a764d80a6\") " Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.370319 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc820800-6a99-446e-b203-065a764d80a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc820800-6a99-446e-b203-065a764d80a6" (UID: "dc820800-6a99-446e-b203-065a764d80a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.376127 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc820800-6a99-446e-b203-065a764d80a6-kube-api-access-bmvhc" (OuterVolumeSpecName: "kube-api-access-bmvhc") pod "dc820800-6a99-446e-b203-065a764d80a6" (UID: "dc820800-6a99-446e-b203-065a764d80a6"). InnerVolumeSpecName "kube-api-access-bmvhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.470964 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pd2z\" (UniqueName: \"kubernetes.io/projected/606167d1-6cc1-4177-ae73-8369b4bddcf4-kube-api-access-5pd2z\") pod \"606167d1-6cc1-4177-ae73-8369b4bddcf4\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.471105 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/606167d1-6cc1-4177-ae73-8369b4bddcf4-operator-scripts\") pod \"606167d1-6cc1-4177-ae73-8369b4bddcf4\" (UID: \"606167d1-6cc1-4177-ae73-8369b4bddcf4\") " Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.471612 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmvhc\" (UniqueName: \"kubernetes.io/projected/dc820800-6a99-446e-b203-065a764d80a6-kube-api-access-bmvhc\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.471636 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc820800-6a99-446e-b203-065a764d80a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.472015 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/606167d1-6cc1-4177-ae73-8369b4bddcf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "606167d1-6cc1-4177-ae73-8369b4bddcf4" (UID: "606167d1-6cc1-4177-ae73-8369b4bddcf4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.473783 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606167d1-6cc1-4177-ae73-8369b4bddcf4-kube-api-access-5pd2z" (OuterVolumeSpecName: "kube-api-access-5pd2z") pod "606167d1-6cc1-4177-ae73-8369b4bddcf4" (UID: "606167d1-6cc1-4177-ae73-8369b4bddcf4"). InnerVolumeSpecName "kube-api-access-5pd2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.573828 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/606167d1-6cc1-4177-ae73-8369b4bddcf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.573874 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pd2z\" (UniqueName: \"kubernetes.io/projected/606167d1-6cc1-4177-ae73-8369b4bddcf4-kube-api-access-5pd2z\") on node \"crc\" DevicePath \"\"" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.976887 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-837b-account-create-update-tb5cw" event={"ID":"606167d1-6cc1-4177-ae73-8369b4bddcf4","Type":"ContainerDied","Data":"cdd89cadf3ab83af8af7004da52b0f934b7aee815655e543ec8055ee9d08bd45"} Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.977250 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdd89cadf3ab83af8af7004da52b0f934b7aee815655e543ec8055ee9d08bd45" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.976934 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-837b-account-create-update-tb5cw" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.978624 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lndx6" event={"ID":"dc820800-6a99-446e-b203-065a764d80a6","Type":"ContainerDied","Data":"35e268f896096ece4f6e1200a907d3cc62dc8294e0a6db77a0dc1810f6cc7d74"} Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.978652 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lndx6" Dec 16 08:15:36 crc kubenswrapper[4789]: I1216 08:15:36.978659 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35e268f896096ece4f6e1200a907d3cc62dc8294e0a6db77a0dc1810f6cc7d74" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.432383 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ffxcd"] Dec 16 08:15:38 crc kubenswrapper[4789]: E1216 08:15:38.432765 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc820800-6a99-446e-b203-065a764d80a6" containerName="mariadb-database-create" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.432782 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc820800-6a99-446e-b203-065a764d80a6" containerName="mariadb-database-create" Dec 16 08:15:38 crc kubenswrapper[4789]: E1216 08:15:38.432795 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606167d1-6cc1-4177-ae73-8369b4bddcf4" containerName="mariadb-account-create-update" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.432803 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="606167d1-6cc1-4177-ae73-8369b4bddcf4" containerName="mariadb-account-create-update" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.433019 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc820800-6a99-446e-b203-065a764d80a6" containerName="mariadb-database-create" Dec 16 
08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.433042 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="606167d1-6cc1-4177-ae73-8369b4bddcf4" containerName="mariadb-account-create-update" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.433686 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.436307 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.436525 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6q9tg" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.456253 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ffxcd"] Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.627365 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-combined-ca-bundle\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.627458 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-db-sync-config-data\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.627552 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-config-data\") pod \"glance-db-sync-ffxcd\" (UID: 
\"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.627634 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqwv4\" (UniqueName: \"kubernetes.io/projected/6387d858-ac8d-4d8d-b910-f12598ffc6bb-kube-api-access-zqwv4\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.729498 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-config-data\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.729609 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqwv4\" (UniqueName: \"kubernetes.io/projected/6387d858-ac8d-4d8d-b910-f12598ffc6bb-kube-api-access-zqwv4\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.729700 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-combined-ca-bundle\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.729757 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-db-sync-config-data\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 
16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.735972 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-db-sync-config-data\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.737826 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-combined-ca-bundle\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.741727 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-config-data\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:38 crc kubenswrapper[4789]: I1216 08:15:38.752024 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqwv4\" (UniqueName: \"kubernetes.io/projected/6387d858-ac8d-4d8d-b910-f12598ffc6bb-kube-api-access-zqwv4\") pod \"glance-db-sync-ffxcd\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:39 crc kubenswrapper[4789]: I1216 08:15:39.050650 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ffxcd" Dec 16 08:15:39 crc kubenswrapper[4789]: I1216 08:15:39.529950 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ffxcd"] Dec 16 08:15:39 crc kubenswrapper[4789]: W1216 08:15:39.535105 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6387d858_ac8d_4d8d_b910_f12598ffc6bb.slice/crio-a866da5b3af2fb9b9dc26fe479b79e35bcb1b1339034e6e8cc774a09c611996a WatchSource:0}: Error finding container a866da5b3af2fb9b9dc26fe479b79e35bcb1b1339034e6e8cc774a09c611996a: Status 404 returned error can't find the container with id a866da5b3af2fb9b9dc26fe479b79e35bcb1b1339034e6e8cc774a09c611996a Dec 16 08:15:39 crc kubenswrapper[4789]: I1216 08:15:39.537347 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:15:40 crc kubenswrapper[4789]: I1216 08:15:40.002429 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ffxcd" event={"ID":"6387d858-ac8d-4d8d-b910-f12598ffc6bb","Type":"ContainerStarted","Data":"a866da5b3af2fb9b9dc26fe479b79e35bcb1b1339034e6e8cc774a09c611996a"} Dec 16 08:15:51 crc kubenswrapper[4789]: I1216 08:15:51.928344 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:15:51 crc kubenswrapper[4789]: I1216 08:15:51.928979 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:15:51 crc 
kubenswrapper[4789]: I1216 08:15:51.929081 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:15:51 crc kubenswrapper[4789]: I1216 08:15:51.930009 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"263dc2616d80f4ff03c2ff79d40529ccbb3a132a477a0c3fb859bf304de469fc"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:15:51 crc kubenswrapper[4789]: I1216 08:15:51.930072 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://263dc2616d80f4ff03c2ff79d40529ccbb3a132a477a0c3fb859bf304de469fc" gracePeriod=600 Dec 16 08:15:53 crc kubenswrapper[4789]: I1216 08:15:53.104324 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="263dc2616d80f4ff03c2ff79d40529ccbb3a132a477a0c3fb859bf304de469fc" exitCode=0 Dec 16 08:15:53 crc kubenswrapper[4789]: I1216 08:15:53.104423 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"263dc2616d80f4ff03c2ff79d40529ccbb3a132a477a0c3fb859bf304de469fc"} Dec 16 08:15:53 crc kubenswrapper[4789]: I1216 08:15:53.104799 4789 scope.go:117] "RemoveContainer" containerID="e69cc730fec5c0ed5519a8eaae1cd89896a17324d6d55d31bbf1c8332967d193" Dec 16 08:15:59 crc kubenswrapper[4789]: I1216 08:15:59.146629 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ffxcd" 
event={"ID":"6387d858-ac8d-4d8d-b910-f12598ffc6bb","Type":"ContainerStarted","Data":"90919f111ce167d64c47fd72e53d47d04e018b50823f690529d052457320815e"} Dec 16 08:15:59 crc kubenswrapper[4789]: I1216 08:15:59.149314 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987"} Dec 16 08:15:59 crc kubenswrapper[4789]: I1216 08:15:59.168235 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ffxcd" podStartSLOduration=2.332718925 podStartE2EDuration="21.168217878s" podCreationTimestamp="2025-12-16 08:15:38 +0000 UTC" firstStartedPulling="2025-12-16 08:15:39.537109164 +0000 UTC m=+5077.798996793" lastFinishedPulling="2025-12-16 08:15:58.372608117 +0000 UTC m=+5096.634495746" observedRunningTime="2025-12-16 08:15:59.162321224 +0000 UTC m=+5097.424208883" watchObservedRunningTime="2025-12-16 08:15:59.168217878 +0000 UTC m=+5097.430105507" Dec 16 08:16:02 crc kubenswrapper[4789]: I1216 08:16:02.177441 4789 generic.go:334] "Generic (PLEG): container finished" podID="6387d858-ac8d-4d8d-b910-f12598ffc6bb" containerID="90919f111ce167d64c47fd72e53d47d04e018b50823f690529d052457320815e" exitCode=0 Dec 16 08:16:02 crc kubenswrapper[4789]: I1216 08:16:02.177513 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ffxcd" event={"ID":"6387d858-ac8d-4d8d-b910-f12598ffc6bb","Type":"ContainerDied","Data":"90919f111ce167d64c47fd72e53d47d04e018b50823f690529d052457320815e"} Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.529842 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ffxcd" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.606667 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqwv4\" (UniqueName: \"kubernetes.io/projected/6387d858-ac8d-4d8d-b910-f12598ffc6bb-kube-api-access-zqwv4\") pod \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.606706 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-combined-ca-bundle\") pod \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.606779 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-db-sync-config-data\") pod \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.606805 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-config-data\") pod \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\" (UID: \"6387d858-ac8d-4d8d-b910-f12598ffc6bb\") " Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.612053 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6387d858-ac8d-4d8d-b910-f12598ffc6bb" (UID: "6387d858-ac8d-4d8d-b910-f12598ffc6bb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.614060 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6387d858-ac8d-4d8d-b910-f12598ffc6bb-kube-api-access-zqwv4" (OuterVolumeSpecName: "kube-api-access-zqwv4") pod "6387d858-ac8d-4d8d-b910-f12598ffc6bb" (UID: "6387d858-ac8d-4d8d-b910-f12598ffc6bb"). InnerVolumeSpecName "kube-api-access-zqwv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.646113 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6387d858-ac8d-4d8d-b910-f12598ffc6bb" (UID: "6387d858-ac8d-4d8d-b910-f12598ffc6bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.671586 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-config-data" (OuterVolumeSpecName: "config-data") pod "6387d858-ac8d-4d8d-b910-f12598ffc6bb" (UID: "6387d858-ac8d-4d8d-b910-f12598ffc6bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.709991 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqwv4\" (UniqueName: \"kubernetes.io/projected/6387d858-ac8d-4d8d-b910-f12598ffc6bb-kube-api-access-zqwv4\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.710023 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.710034 4789 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:03 crc kubenswrapper[4789]: I1216 08:16:03.710045 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387d858-ac8d-4d8d-b910-f12598ffc6bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.198470 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ffxcd" event={"ID":"6387d858-ac8d-4d8d-b910-f12598ffc6bb","Type":"ContainerDied","Data":"a866da5b3af2fb9b9dc26fe479b79e35bcb1b1339034e6e8cc774a09c611996a"} Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.198543 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a866da5b3af2fb9b9dc26fe479b79e35bcb1b1339034e6e8cc774a09c611996a" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.198730 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ffxcd" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.477764 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:04 crc kubenswrapper[4789]: E1216 08:16:04.478191 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387d858-ac8d-4d8d-b910-f12598ffc6bb" containerName="glance-db-sync" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.478210 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387d858-ac8d-4d8d-b910-f12598ffc6bb" containerName="glance-db-sync" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.478432 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387d858-ac8d-4d8d-b910-f12598ffc6bb" containerName="glance-db-sync" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.479546 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.481987 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.482279 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.482438 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6q9tg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.482671 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.498034 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.591738 4789 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-869b597f9f-w6kfg"] Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.594234 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.616032 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869b597f9f-w6kfg"] Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.633054 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfnk\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-kube-api-access-mgfnk\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.633123 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-config-data\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.633185 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.633243 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-logs\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " 
pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.633291 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-scripts\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.633311 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-ceph\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.633334 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.683016 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.684414 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.686111 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.702552 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.734945 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-config-data\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.735880 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-config\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.735934 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-dns-svc\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.735992 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc 
kubenswrapper[4789]: I1216 08:16:04.736012 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-nb\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.736065 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-sb\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.736126 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-logs\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.736184 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-scripts\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.736217 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-ceph\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.736285 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.736354 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdp42\" (UniqueName: \"kubernetes.io/projected/b32cd691-9873-4057-9f9a-15b65c718cba-kube-api-access-qdp42\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.736429 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfnk\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-kube-api-access-mgfnk\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.737030 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.737399 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-logs\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.744714 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-config-data\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.744750 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.745281 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-ceph\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.755050 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-scripts\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.759207 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfnk\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-kube-api-access-mgfnk\") pod \"glance-default-external-api-0\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.796475 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.837780 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-config\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.837826 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-dns-svc\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.837874 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-nb\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.837926 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-sb\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.837976 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.838016 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.838059 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.838086 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.838128 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdp42\" (UniqueName: \"kubernetes.io/projected/b32cd691-9873-4057-9f9a-15b65c718cba-kube-api-access-qdp42\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.838155 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.838178 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh64b\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-kube-api-access-hh64b\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.838212 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.839642 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-config\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.839815 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-dns-svc\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.841689 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-sb\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc 
kubenswrapper[4789]: I1216 08:16:04.842517 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-nb\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.863199 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdp42\" (UniqueName: \"kubernetes.io/projected/b32cd691-9873-4057-9f9a-15b65c718cba-kube-api-access-qdp42\") pod \"dnsmasq-dns-869b597f9f-w6kfg\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.909348 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.939679 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.939730 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.939781 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.939810 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.939852 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.939875 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh64b\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-kube-api-access-hh64b\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.939904 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.941181 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 
16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.941263 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-logs\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.946600 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.947109 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.947755 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.948294 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:04 crc kubenswrapper[4789]: I1216 08:16:04.973156 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hh64b\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-kube-api-access-hh64b\") pod \"glance-default-internal-api-0\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:05 crc kubenswrapper[4789]: I1216 08:16:05.001175 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:05 crc kubenswrapper[4789]: W1216 08:16:05.197942 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb32cd691_9873_4057_9f9a_15b65c718cba.slice/crio-8f9af017b864efb29a9c6fe83596c5ea4260c94db1038ce60ca00063a712a27c WatchSource:0}: Error finding container 8f9af017b864efb29a9c6fe83596c5ea4260c94db1038ce60ca00063a712a27c: Status 404 returned error can't find the container with id 8f9af017b864efb29a9c6fe83596c5ea4260c94db1038ce60ca00063a712a27c Dec 16 08:16:05 crc kubenswrapper[4789]: I1216 08:16:05.199321 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869b597f9f-w6kfg"] Dec 16 08:16:05 crc kubenswrapper[4789]: I1216 08:16:05.210378 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" event={"ID":"b32cd691-9873-4057-9f9a-15b65c718cba","Type":"ContainerStarted","Data":"8f9af017b864efb29a9c6fe83596c5ea4260c94db1038ce60ca00063a712a27c"} Dec 16 08:16:05 crc kubenswrapper[4789]: I1216 08:16:05.377589 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:05 crc kubenswrapper[4789]: I1216 08:16:05.584468 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:05 crc kubenswrapper[4789]: I1216 08:16:05.675101 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:06 crc 
kubenswrapper[4789]: I1216 08:16:06.243269 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7abb7c09-73f0-4ac8-b44b-067170173fbc","Type":"ContainerStarted","Data":"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e"} Dec 16 08:16:06 crc kubenswrapper[4789]: I1216 08:16:06.243870 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7abb7c09-73f0-4ac8-b44b-067170173fbc","Type":"ContainerStarted","Data":"1658d57d4720a1a444fa34b9e7c6c8c132a660f1823f7b8d08c1918a18ef2a6c"} Dec 16 08:16:06 crc kubenswrapper[4789]: I1216 08:16:06.258286 4789 generic.go:334] "Generic (PLEG): container finished" podID="b32cd691-9873-4057-9f9a-15b65c718cba" containerID="66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80" exitCode=0 Dec 16 08:16:06 crc kubenswrapper[4789]: I1216 08:16:06.258343 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" event={"ID":"b32cd691-9873-4057-9f9a-15b65c718cba","Type":"ContainerDied","Data":"66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80"} Dec 16 08:16:06 crc kubenswrapper[4789]: I1216 08:16:06.260984 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea7f9a1e-dc1b-43db-a427-5f003cf545a3","Type":"ContainerStarted","Data":"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154"} Dec 16 08:16:06 crc kubenswrapper[4789]: I1216 08:16:06.261055 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea7f9a1e-dc1b-43db-a427-5f003cf545a3","Type":"ContainerStarted","Data":"8d0f9eb1b102169c11c2b2a46e447115140eb2b9d00a5e9a33cfdcc47169bae1"} Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.271660 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" 
event={"ID":"b32cd691-9873-4057-9f9a-15b65c718cba","Type":"ContainerStarted","Data":"12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9"} Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.272012 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.275138 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea7f9a1e-dc1b-43db-a427-5f003cf545a3","Type":"ContainerStarted","Data":"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1"} Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.275224 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-log" containerID="cri-o://c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154" gracePeriod=30 Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.275282 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-httpd" containerID="cri-o://504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1" gracePeriod=30 Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.278931 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7abb7c09-73f0-4ac8-b44b-067170173fbc","Type":"ContainerStarted","Data":"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e"} Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.295378 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" podStartSLOduration=3.295357449 podStartE2EDuration="3.295357449s" podCreationTimestamp="2025-12-16 08:16:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:16:07.288182304 +0000 UTC m=+5105.550069943" watchObservedRunningTime="2025-12-16 08:16:07.295357449 +0000 UTC m=+5105.557245078" Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.317203 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.317177321 podStartE2EDuration="3.317177321s" podCreationTimestamp="2025-12-16 08:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:16:07.313945833 +0000 UTC m=+5105.575833472" watchObservedRunningTime="2025-12-16 08:16:07.317177321 +0000 UTC m=+5105.579064950" Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.370170 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.370142096 podStartE2EDuration="3.370142096s" podCreationTimestamp="2025-12-16 08:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:16:07.335507509 +0000 UTC m=+5105.597395128" watchObservedRunningTime="2025-12-16 08:16:07.370142096 +0000 UTC m=+5105.632029725" Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.394119 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:07 crc kubenswrapper[4789]: I1216 08:16:07.883092 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.001771 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-scripts\") pod \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.001975 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-config-data\") pod \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.002027 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgfnk\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-kube-api-access-mgfnk\") pod \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.002065 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-ceph\") pod \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.002100 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-httpd-run\") pod \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.002131 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-logs\") pod \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.002176 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-combined-ca-bundle\") pod \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\" (UID: \"ea7f9a1e-dc1b-43db-a427-5f003cf545a3\") " Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.002880 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-logs" (OuterVolumeSpecName: "logs") pod "ea7f9a1e-dc1b-43db-a427-5f003cf545a3" (UID: "ea7f9a1e-dc1b-43db-a427-5f003cf545a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.003043 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ea7f9a1e-dc1b-43db-a427-5f003cf545a3" (UID: "ea7f9a1e-dc1b-43db-a427-5f003cf545a3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.007515 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-ceph" (OuterVolumeSpecName: "ceph") pod "ea7f9a1e-dc1b-43db-a427-5f003cf545a3" (UID: "ea7f9a1e-dc1b-43db-a427-5f003cf545a3"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.008214 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-kube-api-access-mgfnk" (OuterVolumeSpecName: "kube-api-access-mgfnk") pod "ea7f9a1e-dc1b-43db-a427-5f003cf545a3" (UID: "ea7f9a1e-dc1b-43db-a427-5f003cf545a3"). InnerVolumeSpecName "kube-api-access-mgfnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.008402 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-scripts" (OuterVolumeSpecName: "scripts") pod "ea7f9a1e-dc1b-43db-a427-5f003cf545a3" (UID: "ea7f9a1e-dc1b-43db-a427-5f003cf545a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.028101 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea7f9a1e-dc1b-43db-a427-5f003cf545a3" (UID: "ea7f9a1e-dc1b-43db-a427-5f003cf545a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.048890 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-config-data" (OuterVolumeSpecName: "config-data") pod "ea7f9a1e-dc1b-43db-a427-5f003cf545a3" (UID: "ea7f9a1e-dc1b-43db-a427-5f003cf545a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.103929 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.103970 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.103982 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.103995 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgfnk\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-kube-api-access-mgfnk\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.104011 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.104021 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.104032 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea7f9a1e-dc1b-43db-a427-5f003cf545a3-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.289272 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerID="504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1" exitCode=0 Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.289303 4789 generic.go:334] "Generic (PLEG): container finished" podID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerID="c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154" exitCode=143 Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.289323 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea7f9a1e-dc1b-43db-a427-5f003cf545a3","Type":"ContainerDied","Data":"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1"} Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.289354 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.289386 4789 scope.go:117] "RemoveContainer" containerID="504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.289371 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea7f9a1e-dc1b-43db-a427-5f003cf545a3","Type":"ContainerDied","Data":"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154"} Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.289515 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ea7f9a1e-dc1b-43db-a427-5f003cf545a3","Type":"ContainerDied","Data":"8d0f9eb1b102169c11c2b2a46e447115140eb2b9d00a5e9a33cfdcc47169bae1"} Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.312146 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.320179 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.321331 4789 scope.go:117] "RemoveContainer" containerID="c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.345881 4789 scope.go:117] "RemoveContainer" containerID="504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1" Dec 16 08:16:08 crc kubenswrapper[4789]: E1216 08:16:08.347437 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1\": container with ID starting with 504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1 not found: ID does not exist" containerID="504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.347472 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1"} err="failed to get container status \"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1\": rpc error: code = NotFound desc = could not find container \"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1\": container with ID starting with 504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1 not found: ID does not exist" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.347495 4789 scope.go:117] "RemoveContainer" containerID="c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154" Dec 16 08:16:08 crc kubenswrapper[4789]: E1216 08:16:08.347742 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154\": container with ID starting with c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154 
not found: ID does not exist" containerID="c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.347782 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154"} err="failed to get container status \"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154\": rpc error: code = NotFound desc = could not find container \"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154\": container with ID starting with c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154 not found: ID does not exist" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.347811 4789 scope.go:117] "RemoveContainer" containerID="504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.348035 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1"} err="failed to get container status \"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1\": rpc error: code = NotFound desc = could not find container \"504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1\": container with ID starting with 504ca98140f56b896733699e13500b0c2ad17ac1778c9446b11b3b8417cd23a1 not found: ID does not exist" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.348052 4789 scope.go:117] "RemoveContainer" containerID="c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.348213 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154"} err="failed to get container status \"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154\": rpc 
error: code = NotFound desc = could not find container \"c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154\": container with ID starting with c0e5e3d53f985fdf35a69ba8d7eaf0cb94623d6abaf8756ed681ae031a108154 not found: ID does not exist" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.349126 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:08 crc kubenswrapper[4789]: E1216 08:16:08.349540 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-log" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.349558 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-log" Dec 16 08:16:08 crc kubenswrapper[4789]: E1216 08:16:08.349567 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-httpd" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.349574 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-httpd" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.349793 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-log" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.349804 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" containerName="glance-httpd" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.350726 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.360044 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.384494 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.513454 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.513513 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-logs\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.513610 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8glzg\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-kube-api-access-8glzg\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.513705 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " 
pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.513727 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-ceph\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.513840 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.513875 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.615754 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.615796 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-logs\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 
crc kubenswrapper[4789]: I1216 08:16:08.615824 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8glzg\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-kube-api-access-8glzg\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.615856 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.615873 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-ceph\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.615901 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.615967 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.616235 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.616355 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-logs\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.619889 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.619995 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.620592 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-ceph\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.620791 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.632864 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8glzg\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-kube-api-access-8glzg\") pod \"glance-default-external-api-0\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " pod="openstack/glance-default-external-api-0" Dec 16 08:16:08 crc kubenswrapper[4789]: I1216 08:16:08.720119 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:16:09 crc kubenswrapper[4789]: I1216 08:16:09.237449 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:16:09 crc kubenswrapper[4789]: I1216 08:16:09.301897 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea","Type":"ContainerStarted","Data":"5da7599cf296c984fb95ed5a86379a55e662eedc72901c6ed552ba2dc07bdc67"} Dec 16 08:16:09 crc kubenswrapper[4789]: I1216 08:16:09.302109 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-log" containerID="cri-o://fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e" gracePeriod=30 Dec 16 08:16:09 crc kubenswrapper[4789]: I1216 08:16:09.302129 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-httpd" containerID="cri-o://d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e" gracePeriod=30 Dec 16 
08:16:09 crc kubenswrapper[4789]: I1216 08:16:09.952850 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.117120 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea7f9a1e-dc1b-43db-a427-5f003cf545a3" path="/var/lib/kubelet/pods/ea7f9a1e-dc1b-43db-a427-5f003cf545a3/volumes" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.137835 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-ceph\") pod \"7abb7c09-73f0-4ac8-b44b-067170173fbc\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.137878 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-combined-ca-bundle\") pod \"7abb7c09-73f0-4ac8-b44b-067170173fbc\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.138045 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh64b\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-kube-api-access-hh64b\") pod \"7abb7c09-73f0-4ac8-b44b-067170173fbc\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.138175 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-logs\") pod \"7abb7c09-73f0-4ac8-b44b-067170173fbc\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.138200 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-config-data\") pod \"7abb7c09-73f0-4ac8-b44b-067170173fbc\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.138271 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-scripts\") pod \"7abb7c09-73f0-4ac8-b44b-067170173fbc\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.138313 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-httpd-run\") pod \"7abb7c09-73f0-4ac8-b44b-067170173fbc\" (UID: \"7abb7c09-73f0-4ac8-b44b-067170173fbc\") " Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.138868 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7abb7c09-73f0-4ac8-b44b-067170173fbc" (UID: "7abb7c09-73f0-4ac8-b44b-067170173fbc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.139145 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-logs" (OuterVolumeSpecName: "logs") pod "7abb7c09-73f0-4ac8-b44b-067170173fbc" (UID: "7abb7c09-73f0-4ac8-b44b-067170173fbc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.145481 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-scripts" (OuterVolumeSpecName: "scripts") pod "7abb7c09-73f0-4ac8-b44b-067170173fbc" (UID: "7abb7c09-73f0-4ac8-b44b-067170173fbc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.149831 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-kube-api-access-hh64b" (OuterVolumeSpecName: "kube-api-access-hh64b") pod "7abb7c09-73f0-4ac8-b44b-067170173fbc" (UID: "7abb7c09-73f0-4ac8-b44b-067170173fbc"). InnerVolumeSpecName "kube-api-access-hh64b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.151183 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-ceph" (OuterVolumeSpecName: "ceph") pod "7abb7c09-73f0-4ac8-b44b-067170173fbc" (UID: "7abb7c09-73f0-4ac8-b44b-067170173fbc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.165488 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7abb7c09-73f0-4ac8-b44b-067170173fbc" (UID: "7abb7c09-73f0-4ac8-b44b-067170173fbc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.193317 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-config-data" (OuterVolumeSpecName: "config-data") pod "7abb7c09-73f0-4ac8-b44b-067170173fbc" (UID: "7abb7c09-73f0-4ac8-b44b-067170173fbc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.240694 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.240751 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.240770 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.240790 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.240810 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh64b\" (UniqueName: \"kubernetes.io/projected/7abb7c09-73f0-4ac8-b44b-067170173fbc-kube-api-access-hh64b\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.240829 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7abb7c09-73f0-4ac8-b44b-067170173fbc-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.240846 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb7c09-73f0-4ac8-b44b-067170173fbc-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.313894 4789 generic.go:334] "Generic (PLEG): container finished" podID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerID="d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e" exitCode=0 Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.313959 4789 generic.go:334] "Generic (PLEG): container finished" podID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerID="fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e" exitCode=143 Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.313960 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7abb7c09-73f0-4ac8-b44b-067170173fbc","Type":"ContainerDied","Data":"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e"} Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.314010 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7abb7c09-73f0-4ac8-b44b-067170173fbc","Type":"ContainerDied","Data":"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e"} Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.314022 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.314043 4789 scope.go:117] "RemoveContainer" containerID="d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.314029 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7abb7c09-73f0-4ac8-b44b-067170173fbc","Type":"ContainerDied","Data":"1658d57d4720a1a444fa34b9e7c6c8c132a660f1823f7b8d08c1918a18ef2a6c"} Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.317809 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea","Type":"ContainerStarted","Data":"68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588"} Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.354428 4789 scope.go:117] "RemoveContainer" containerID="fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.361322 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.370191 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.378806 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:10 crc kubenswrapper[4789]: E1216 08:16:10.379324 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-log" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.379371 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-log" Dec 16 08:16:10 crc kubenswrapper[4789]: E1216 08:16:10.379407 
4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-httpd" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.379441 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-httpd" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.379689 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-log" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.379709 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" containerName="glance-httpd" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.381550 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.386085 4789 scope.go:117] "RemoveContainer" containerID="d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.386538 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 08:16:10 crc kubenswrapper[4789]: E1216 08:16:10.387073 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e\": container with ID starting with d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e not found: ID does not exist" containerID="d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.387103 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e"} err="failed to get container 
status \"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e\": rpc error: code = NotFound desc = could not find container \"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e\": container with ID starting with d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e not found: ID does not exist" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.387122 4789 scope.go:117] "RemoveContainer" containerID="fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e" Dec 16 08:16:10 crc kubenswrapper[4789]: E1216 08:16:10.387390 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e\": container with ID starting with fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e not found: ID does not exist" containerID="fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.387417 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e"} err="failed to get container status \"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e\": rpc error: code = NotFound desc = could not find container \"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e\": container with ID starting with fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e not found: ID does not exist" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.387435 4789 scope.go:117] "RemoveContainer" containerID="d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.387749 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e"} err="failed to get 
container status \"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e\": rpc error: code = NotFound desc = could not find container \"d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e\": container with ID starting with d26711e05a2c5142eba3ba0493c4eee11b8509a30afee9ea5bdcab19607d214e not found: ID does not exist" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.387776 4789 scope.go:117] "RemoveContainer" containerID="fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.388039 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e"} err="failed to get container status \"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e\": rpc error: code = NotFound desc = could not find container \"fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e\": container with ID starting with fadaaadcce19d0aa11ed13db2a3080c0f768cc076e7c20b473a4d9ae0540575e not found: ID does not exist" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.423057 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.557494 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.557548 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.557607 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.557626 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.557652 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.557704 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5p5f\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-kube-api-access-b5p5f\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.557837 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.660788 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.660905 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.660987 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.661044 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5p5f\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-kube-api-access-b5p5f\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.661104 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.661198 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.661249 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.661820 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.677642 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.662134 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.680427 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.687549 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.687575 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.688948 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5p5f\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-kube-api-access-b5p5f\") pod \"glance-default-internal-api-0\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:16:10 crc kubenswrapper[4789]: I1216 08:16:10.718588 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:11 crc kubenswrapper[4789]: I1216 08:16:11.025122 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:16:11 crc kubenswrapper[4789]: W1216 08:16:11.029672 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc74ae8fa_e4f5_4138_9b8c_356ff2345ba3.slice/crio-001346f108f9b66d8b594c1123591540e02bc7bc39c9ef6658e42e7e97968220 WatchSource:0}: Error finding container 001346f108f9b66d8b594c1123591540e02bc7bc39c9ef6658e42e7e97968220: Status 404 returned error can't find the container with id 001346f108f9b66d8b594c1123591540e02bc7bc39c9ef6658e42e7e97968220 Dec 16 08:16:11 crc kubenswrapper[4789]: I1216 08:16:11.327895 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3","Type":"ContainerStarted","Data":"001346f108f9b66d8b594c1123591540e02bc7bc39c9ef6658e42e7e97968220"} Dec 16 08:16:11 crc kubenswrapper[4789]: I1216 08:16:11.331391 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea","Type":"ContainerStarted","Data":"25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f"} Dec 16 08:16:11 crc kubenswrapper[4789]: I1216 08:16:11.355903 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.355884019 podStartE2EDuration="3.355884019s" podCreationTimestamp="2025-12-16 08:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:16:11.348214831 +0000 UTC m=+5109.610102470" watchObservedRunningTime="2025-12-16 08:16:11.355884019 +0000 UTC m=+5109.617771648" Dec 16 08:16:12 crc 
kubenswrapper[4789]: I1216 08:16:12.133853 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7abb7c09-73f0-4ac8-b44b-067170173fbc" path="/var/lib/kubelet/pods/7abb7c09-73f0-4ac8-b44b-067170173fbc/volumes" Dec 16 08:16:12 crc kubenswrapper[4789]: I1216 08:16:12.340816 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3","Type":"ContainerStarted","Data":"f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1"} Dec 16 08:16:12 crc kubenswrapper[4789]: I1216 08:16:12.341374 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3","Type":"ContainerStarted","Data":"67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae"} Dec 16 08:16:12 crc kubenswrapper[4789]: I1216 08:16:12.361281 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.361261916 podStartE2EDuration="2.361261916s" podCreationTimestamp="2025-12-16 08:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:16:12.356263963 +0000 UTC m=+5110.618151592" watchObservedRunningTime="2025-12-16 08:16:12.361261916 +0000 UTC m=+5110.623149545" Dec 16 08:16:14 crc kubenswrapper[4789]: I1216 08:16:14.911166 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:14 crc kubenswrapper[4789]: I1216 08:16:14.986347 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58d7b6d9-vfrgx"] Dec 16 08:16:14 crc kubenswrapper[4789]: I1216 08:16:14.986599 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" 
podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerName="dnsmasq-dns" containerID="cri-o://64ddee54f571d1f9d6e1058c3b9ce8d332686923054b3faed4f534e6a8f5d59c" gracePeriod=10 Dec 16 08:16:15 crc kubenswrapper[4789]: I1216 08:16:15.379984 4789 generic.go:334] "Generic (PLEG): container finished" podID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerID="64ddee54f571d1f9d6e1058c3b9ce8d332686923054b3faed4f534e6a8f5d59c" exitCode=0 Dec 16 08:16:15 crc kubenswrapper[4789]: I1216 08:16:15.380370 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" event={"ID":"272bbed9-b63b-4ecc-b85f-434030f27a80","Type":"ContainerDied","Data":"64ddee54f571d1f9d6e1058c3b9ce8d332686923054b3faed4f534e6a8f5d59c"} Dec 16 08:16:16 crc kubenswrapper[4789]: I1216 08:16:16.282234 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.35:5353: connect: connection refused" Dec 16 08:16:18 crc kubenswrapper[4789]: I1216 08:16:18.720704 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 08:16:18 crc kubenswrapper[4789]: I1216 08:16:18.721024 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 08:16:18 crc kubenswrapper[4789]: I1216 08:16:18.747643 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 08:16:18 crc kubenswrapper[4789]: I1216 08:16:18.777024 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.416970 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 08:16:19 crc 
kubenswrapper[4789]: I1216 08:16:19.417024 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.490965 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.526789 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-nb\") pod \"272bbed9-b63b-4ecc-b85f-434030f27a80\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.527007 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56vk5\" (UniqueName: \"kubernetes.io/projected/272bbed9-b63b-4ecc-b85f-434030f27a80-kube-api-access-56vk5\") pod \"272bbed9-b63b-4ecc-b85f-434030f27a80\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.527050 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-sb\") pod \"272bbed9-b63b-4ecc-b85f-434030f27a80\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.527089 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-config\") pod \"272bbed9-b63b-4ecc-b85f-434030f27a80\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.527130 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-dns-svc\") pod \"272bbed9-b63b-4ecc-b85f-434030f27a80\" (UID: \"272bbed9-b63b-4ecc-b85f-434030f27a80\") " Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.538221 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272bbed9-b63b-4ecc-b85f-434030f27a80-kube-api-access-56vk5" (OuterVolumeSpecName: "kube-api-access-56vk5") pod "272bbed9-b63b-4ecc-b85f-434030f27a80" (UID: "272bbed9-b63b-4ecc-b85f-434030f27a80"). InnerVolumeSpecName "kube-api-access-56vk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.576251 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-config" (OuterVolumeSpecName: "config") pod "272bbed9-b63b-4ecc-b85f-434030f27a80" (UID: "272bbed9-b63b-4ecc-b85f-434030f27a80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.578828 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "272bbed9-b63b-4ecc-b85f-434030f27a80" (UID: "272bbed9-b63b-4ecc-b85f-434030f27a80"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.586429 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "272bbed9-b63b-4ecc-b85f-434030f27a80" (UID: "272bbed9-b63b-4ecc-b85f-434030f27a80"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.588819 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "272bbed9-b63b-4ecc-b85f-434030f27a80" (UID: "272bbed9-b63b-4ecc-b85f-434030f27a80"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.629332 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.629386 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56vk5\" (UniqueName: \"kubernetes.io/projected/272bbed9-b63b-4ecc-b85f-434030f27a80-kube-api-access-56vk5\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.629397 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.629406 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:19 crc kubenswrapper[4789]: I1216 08:16:19.629414 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/272bbed9-b63b-4ecc-b85f-434030f27a80-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.427807 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" 
event={"ID":"272bbed9-b63b-4ecc-b85f-434030f27a80","Type":"ContainerDied","Data":"241d44117bec03976c0371a1ceefe430a4819ba4a9837c42bee00834f85659a5"} Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.427891 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58d7b6d9-vfrgx" Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.427952 4789 scope.go:117] "RemoveContainer" containerID="64ddee54f571d1f9d6e1058c3b9ce8d332686923054b3faed4f534e6a8f5d59c" Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.462413 4789 scope.go:117] "RemoveContainer" containerID="8cfb5362fa2c50f67ba70bde43c016e5b44e4f04b0df6b374ce243a69a2214b9" Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.470970 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58d7b6d9-vfrgx"] Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.482701 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58d7b6d9-vfrgx"] Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.718964 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.719027 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.745274 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:20 crc kubenswrapper[4789]: I1216 08:16:20.757448 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:21 crc kubenswrapper[4789]: I1216 08:16:21.353262 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 08:16:21 crc kubenswrapper[4789]: I1216 
08:16:21.356824 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 08:16:21 crc kubenswrapper[4789]: I1216 08:16:21.450528 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:21 crc kubenswrapper[4789]: I1216 08:16:21.450557 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:22 crc kubenswrapper[4789]: I1216 08:16:22.113671 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" path="/var/lib/kubelet/pods/272bbed9-b63b-4ecc-b85f-434030f27a80/volumes" Dec 16 08:16:23 crc kubenswrapper[4789]: I1216 08:16:23.353290 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:23 crc kubenswrapper[4789]: I1216 08:16:23.469656 4789 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 08:16:23 crc kubenswrapper[4789]: I1216 08:16:23.493644 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 08:16:29 crc kubenswrapper[4789]: I1216 08:16:29.909310 4789 scope.go:117] "RemoveContainer" containerID="81a5aadff1a0aa2385fad4d2d2f4e2ac647c997814200aaa34bd24ec9095f2ad" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.127156 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-355b-account-create-update-fc9kb"] Dec 16 08:16:31 crc kubenswrapper[4789]: E1216 08:16:31.127747 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerName="init" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.127760 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerName="init" Dec 16 08:16:31 crc 
kubenswrapper[4789]: E1216 08:16:31.127774 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerName="dnsmasq-dns" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.127780 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerName="dnsmasq-dns" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.127955 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="272bbed9-b63b-4ecc-b85f-434030f27a80" containerName="dnsmasq-dns" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.128472 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.130129 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.134059 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zpkbs"] Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.135420 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.148328 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-355b-account-create-update-fc9kb"] Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.169410 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zpkbs"] Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.233241 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkrdw\" (UniqueName: \"kubernetes.io/projected/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-kube-api-access-hkrdw\") pod \"placement-db-create-zpkbs\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.233285 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-operator-scripts\") pod \"placement-355b-account-create-update-fc9kb\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.233385 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-operator-scripts\") pod \"placement-db-create-zpkbs\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.233416 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqq5g\" (UniqueName: \"kubernetes.io/projected/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-kube-api-access-vqq5g\") pod 
\"placement-355b-account-create-update-fc9kb\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.334448 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkrdw\" (UniqueName: \"kubernetes.io/projected/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-kube-api-access-hkrdw\") pod \"placement-db-create-zpkbs\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.334771 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-operator-scripts\") pod \"placement-355b-account-create-update-fc9kb\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.334904 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-operator-scripts\") pod \"placement-db-create-zpkbs\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.335029 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqq5g\" (UniqueName: \"kubernetes.io/projected/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-kube-api-access-vqq5g\") pod \"placement-355b-account-create-update-fc9kb\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.335616 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-operator-scripts\") pod \"placement-355b-account-create-update-fc9kb\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.335813 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-operator-scripts\") pod \"placement-db-create-zpkbs\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.361048 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqq5g\" (UniqueName: \"kubernetes.io/projected/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-kube-api-access-vqq5g\") pod \"placement-355b-account-create-update-fc9kb\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.361048 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkrdw\" (UniqueName: \"kubernetes.io/projected/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-kube-api-access-hkrdw\") pod \"placement-db-create-zpkbs\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.447978 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.457039 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.901475 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zpkbs"] Dec 16 08:16:31 crc kubenswrapper[4789]: W1216 08:16:31.906540 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode70fb015_ac29_4a0e_a7ea_eb5cd4d9690f.slice/crio-31eda7ac1189344a1c76ca1285f690309e09379173e40456b97bf2bd429f8b9b WatchSource:0}: Error finding container 31eda7ac1189344a1c76ca1285f690309e09379173e40456b97bf2bd429f8b9b: Status 404 returned error can't find the container with id 31eda7ac1189344a1c76ca1285f690309e09379173e40456b97bf2bd429f8b9b Dec 16 08:16:31 crc kubenswrapper[4789]: I1216 08:16:31.984677 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-355b-account-create-update-fc9kb"] Dec 16 08:16:32 crc kubenswrapper[4789]: I1216 08:16:32.544478 4789 generic.go:334] "Generic (PLEG): container finished" podID="ac1e854f-a5fd-4cf1-9f74-dcef650d90b4" containerID="67a695f9157b0862a4d43a2b8b4fcba65384bcf7358897cbe3bd13363b23ef6e" exitCode=0 Dec 16 08:16:32 crc kubenswrapper[4789]: I1216 08:16:32.544650 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-355b-account-create-update-fc9kb" event={"ID":"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4","Type":"ContainerDied","Data":"67a695f9157b0862a4d43a2b8b4fcba65384bcf7358897cbe3bd13363b23ef6e"} Dec 16 08:16:32 crc kubenswrapper[4789]: I1216 08:16:32.544828 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-355b-account-create-update-fc9kb" event={"ID":"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4","Type":"ContainerStarted","Data":"e9244d6b7bc4f09dcf26be1f9dd053772b1111315c0dbbea17ee2120ea430f59"} Dec 16 08:16:32 crc kubenswrapper[4789]: I1216 08:16:32.546413 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f" containerID="36444dd3b3eb4e4d30df9516b4f9a42484f4c22695eaa6d5753167be5037a3cc" exitCode=0 Dec 16 08:16:32 crc kubenswrapper[4789]: I1216 08:16:32.546433 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zpkbs" event={"ID":"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f","Type":"ContainerDied","Data":"36444dd3b3eb4e4d30df9516b4f9a42484f4c22695eaa6d5753167be5037a3cc"} Dec 16 08:16:32 crc kubenswrapper[4789]: I1216 08:16:32.546447 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zpkbs" event={"ID":"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f","Type":"ContainerStarted","Data":"31eda7ac1189344a1c76ca1285f690309e09379173e40456b97bf2bd429f8b9b"} Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.938666 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.946597 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.987225 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqq5g\" (UniqueName: \"kubernetes.io/projected/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-kube-api-access-vqq5g\") pod \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.987598 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-operator-scripts\") pod \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\" (UID: \"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4\") " Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.987734 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkrdw\" (UniqueName: \"kubernetes.io/projected/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-kube-api-access-hkrdw\") pod \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.988012 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-operator-scripts\") pod \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\" (UID: \"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f\") " Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.988446 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac1e854f-a5fd-4cf1-9f74-dcef650d90b4" (UID: "ac1e854f-a5fd-4cf1-9f74-dcef650d90b4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.988542 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f" (UID: "e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.993462 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-kube-api-access-vqq5g" (OuterVolumeSpecName: "kube-api-access-vqq5g") pod "ac1e854f-a5fd-4cf1-9f74-dcef650d90b4" (UID: "ac1e854f-a5fd-4cf1-9f74-dcef650d90b4"). InnerVolumeSpecName "kube-api-access-vqq5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:33 crc kubenswrapper[4789]: I1216 08:16:33.993858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-kube-api-access-hkrdw" (OuterVolumeSpecName: "kube-api-access-hkrdw") pod "e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f" (UID: "e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f"). InnerVolumeSpecName "kube-api-access-hkrdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.089377 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkrdw\" (UniqueName: \"kubernetes.io/projected/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-kube-api-access-hkrdw\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.089409 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.089418 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqq5g\" (UniqueName: \"kubernetes.io/projected/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-kube-api-access-vqq5g\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.089427 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.564455 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-355b-account-create-update-fc9kb" event={"ID":"ac1e854f-a5fd-4cf1-9f74-dcef650d90b4","Type":"ContainerDied","Data":"e9244d6b7bc4f09dcf26be1f9dd053772b1111315c0dbbea17ee2120ea430f59"} Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.564489 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-355b-account-create-update-fc9kb" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.564501 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9244d6b7bc4f09dcf26be1f9dd053772b1111315c0dbbea17ee2120ea430f59" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.566110 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zpkbs" event={"ID":"e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f","Type":"ContainerDied","Data":"31eda7ac1189344a1c76ca1285f690309e09379173e40456b97bf2bd429f8b9b"} Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.566152 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31eda7ac1189344a1c76ca1285f690309e09379173e40456b97bf2bd429f8b9b" Dec 16 08:16:34 crc kubenswrapper[4789]: I1216 08:16:34.566251 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zpkbs" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.446585 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9c59c47c-dzbn6"] Dec 16 08:16:36 crc kubenswrapper[4789]: E1216 08:16:36.447888 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1e854f-a5fd-4cf1-9f74-dcef650d90b4" containerName="mariadb-account-create-update" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.447925 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1e854f-a5fd-4cf1-9f74-dcef650d90b4" containerName="mariadb-account-create-update" Dec 16 08:16:36 crc kubenswrapper[4789]: E1216 08:16:36.447950 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f" containerName="mariadb-database-create" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.447959 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f" 
containerName="mariadb-database-create" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.448165 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f" containerName="mariadb-database-create" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.448193 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1e854f-a5fd-4cf1-9f74-dcef650d90b4" containerName="mariadb-account-create-update" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.449300 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.465841 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c59c47c-dzbn6"] Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.490748 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gdnmc"] Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.493985 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.496926 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9jz26" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.497156 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.497264 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.512149 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gdnmc"] Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.636717 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-combined-ca-bundle\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637055 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-nb\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637097 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-sb\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 
08:16:36.637430 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-config\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637571 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-logs\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637707 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-dns-svc\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637757 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-scripts\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637895 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5db5\" (UniqueName: \"kubernetes.io/projected/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-kube-api-access-c5db5\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637967 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-config-data\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.637999 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7t2m\" (UniqueName: \"kubernetes.io/projected/23f4c5ef-0e8d-4939-b577-14b76d2ece57-kube-api-access-p7t2m\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739621 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-dns-svc\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739666 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-scripts\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739704 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5db5\" (UniqueName: \"kubernetes.io/projected/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-kube-api-access-c5db5\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739723 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-config-data\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739744 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7t2m\" (UniqueName: \"kubernetes.io/projected/23f4c5ef-0e8d-4939-b577-14b76d2ece57-kube-api-access-p7t2m\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739783 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-combined-ca-bundle\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739821 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-nb\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739847 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-sb\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739938 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-config\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.739963 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-logs\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.740389 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-logs\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.740641 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-dns-svc\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.740814 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-sb\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.740932 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-nb\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " 
pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.740985 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-config\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.751137 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-combined-ca-bundle\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.753614 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-scripts\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.753890 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-config-data\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.756026 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5db5\" (UniqueName: \"kubernetes.io/projected/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-kube-api-access-c5db5\") pod \"placement-db-sync-gdnmc\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.760754 4789 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-p7t2m\" (UniqueName: \"kubernetes.io/projected/23f4c5ef-0e8d-4939-b577-14b76d2ece57-kube-api-access-p7t2m\") pod \"dnsmasq-dns-9c59c47c-dzbn6\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.770762 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:36 crc kubenswrapper[4789]: I1216 08:16:36.808312 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:37 crc kubenswrapper[4789]: W1216 08:16:37.218998 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23f4c5ef_0e8d_4939_b577_14b76d2ece57.slice/crio-c05154acfe04cb21850f7ec6dc20c78d93dd21e053817507212400822c7805c1 WatchSource:0}: Error finding container c05154acfe04cb21850f7ec6dc20c78d93dd21e053817507212400822c7805c1: Status 404 returned error can't find the container with id c05154acfe04cb21850f7ec6dc20c78d93dd21e053817507212400822c7805c1 Dec 16 08:16:37 crc kubenswrapper[4789]: I1216 08:16:37.224225 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c59c47c-dzbn6"] Dec 16 08:16:37 crc kubenswrapper[4789]: I1216 08:16:37.304172 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gdnmc"] Dec 16 08:16:37 crc kubenswrapper[4789]: W1216 08:16:37.308449 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode06be30e_36e2_466a_9caa_2cd41c0f4bb6.slice/crio-da51e5fb0452198132b34a38d00a5fe6b11385f8d327b90c2f67eeabdebb65f3 WatchSource:0}: Error finding container da51e5fb0452198132b34a38d00a5fe6b11385f8d327b90c2f67eeabdebb65f3: Status 404 returned error can't find the container with id 
da51e5fb0452198132b34a38d00a5fe6b11385f8d327b90c2f67eeabdebb65f3 Dec 16 08:16:37 crc kubenswrapper[4789]: I1216 08:16:37.597483 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdnmc" event={"ID":"e06be30e-36e2-466a-9caa-2cd41c0f4bb6","Type":"ContainerStarted","Data":"da51e5fb0452198132b34a38d00a5fe6b11385f8d327b90c2f67eeabdebb65f3"} Dec 16 08:16:37 crc kubenswrapper[4789]: I1216 08:16:37.601544 4789 generic.go:334] "Generic (PLEG): container finished" podID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerID="1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34" exitCode=0 Dec 16 08:16:37 crc kubenswrapper[4789]: I1216 08:16:37.601579 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" event={"ID":"23f4c5ef-0e8d-4939-b577-14b76d2ece57","Type":"ContainerDied","Data":"1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34"} Dec 16 08:16:37 crc kubenswrapper[4789]: I1216 08:16:37.601825 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" event={"ID":"23f4c5ef-0e8d-4939-b577-14b76d2ece57","Type":"ContainerStarted","Data":"c05154acfe04cb21850f7ec6dc20c78d93dd21e053817507212400822c7805c1"} Dec 16 08:16:38 crc kubenswrapper[4789]: I1216 08:16:38.611388 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" event={"ID":"23f4c5ef-0e8d-4939-b577-14b76d2ece57","Type":"ContainerStarted","Data":"3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e"} Dec 16 08:16:38 crc kubenswrapper[4789]: I1216 08:16:38.611839 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:38 crc kubenswrapper[4789]: I1216 08:16:38.636647 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" podStartSLOduration=2.636622595 podStartE2EDuration="2.636622595s" 
podCreationTimestamp="2025-12-16 08:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:16:38.629121582 +0000 UTC m=+5136.891009231" watchObservedRunningTime="2025-12-16 08:16:38.636622595 +0000 UTC m=+5136.898510244" Dec 16 08:16:41 crc kubenswrapper[4789]: I1216 08:16:41.638214 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdnmc" event={"ID":"e06be30e-36e2-466a-9caa-2cd41c0f4bb6","Type":"ContainerStarted","Data":"687251f59a32c12d11eb156d5811c1201ccc61fb3cded4b501886f54720185b3"} Dec 16 08:16:42 crc kubenswrapper[4789]: I1216 08:16:42.652784 4789 generic.go:334] "Generic (PLEG): container finished" podID="e06be30e-36e2-466a-9caa-2cd41c0f4bb6" containerID="687251f59a32c12d11eb156d5811c1201ccc61fb3cded4b501886f54720185b3" exitCode=0 Dec 16 08:16:42 crc kubenswrapper[4789]: I1216 08:16:42.652965 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdnmc" event={"ID":"e06be30e-36e2-466a-9caa-2cd41c0f4bb6","Type":"ContainerDied","Data":"687251f59a32c12d11eb156d5811c1201ccc61fb3cded4b501886f54720185b3"} Dec 16 08:16:43 crc kubenswrapper[4789]: I1216 08:16:43.995969 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.184520 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5db5\" (UniqueName: \"kubernetes.io/projected/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-kube-api-access-c5db5\") pod \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.184681 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-combined-ca-bundle\") pod \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.185136 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-scripts\") pod \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.185166 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-config-data\") pod \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.185302 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-logs\") pod \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\" (UID: \"e06be30e-36e2-466a-9caa-2cd41c0f4bb6\") " Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.186092 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-logs" (OuterVolumeSpecName: "logs") pod "e06be30e-36e2-466a-9caa-2cd41c0f4bb6" (UID: "e06be30e-36e2-466a-9caa-2cd41c0f4bb6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.186468 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.190649 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-scripts" (OuterVolumeSpecName: "scripts") pod "e06be30e-36e2-466a-9caa-2cd41c0f4bb6" (UID: "e06be30e-36e2-466a-9caa-2cd41c0f4bb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.191514 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-kube-api-access-c5db5" (OuterVolumeSpecName: "kube-api-access-c5db5") pod "e06be30e-36e2-466a-9caa-2cd41c0f4bb6" (UID: "e06be30e-36e2-466a-9caa-2cd41c0f4bb6"). InnerVolumeSpecName "kube-api-access-c5db5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.209084 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-config-data" (OuterVolumeSpecName: "config-data") pod "e06be30e-36e2-466a-9caa-2cd41c0f4bb6" (UID: "e06be30e-36e2-466a-9caa-2cd41c0f4bb6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.209613 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e06be30e-36e2-466a-9caa-2cd41c0f4bb6" (UID: "e06be30e-36e2-466a-9caa-2cd41c0f4bb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.288885 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.289414 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.289532 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5db5\" (UniqueName: \"kubernetes.io/projected/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-kube-api-access-c5db5\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.289630 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e06be30e-36e2-466a-9caa-2cd41c0f4bb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.678305 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdnmc" event={"ID":"e06be30e-36e2-466a-9caa-2cd41c0f4bb6","Type":"ContainerDied","Data":"da51e5fb0452198132b34a38d00a5fe6b11385f8d327b90c2f67eeabdebb65f3"} Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.678346 4789 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="da51e5fb0452198132b34a38d00a5fe6b11385f8d327b90c2f67eeabdebb65f3" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.678365 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gdnmc" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.759077 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-86f8f549f6-wsg8b"] Dec 16 08:16:44 crc kubenswrapper[4789]: E1216 08:16:44.759791 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06be30e-36e2-466a-9caa-2cd41c0f4bb6" containerName="placement-db-sync" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.759814 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06be30e-36e2-466a-9caa-2cd41c0f4bb6" containerName="placement-db-sync" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.760115 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06be30e-36e2-466a-9caa-2cd41c0f4bb6" containerName="placement-db-sync" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.761693 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.768210 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.768429 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.768736 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9jz26" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.797057 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86f8f549f6-wsg8b"] Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.797549 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-config-data\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.797685 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedac820-75c0-4fe8-865d-39225c3f8b09-logs\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.797739 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-combined-ca-bundle\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.797786 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-scripts\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.797826 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9d4r\" (UniqueName: \"kubernetes.io/projected/aedac820-75c0-4fe8-865d-39225c3f8b09-kube-api-access-q9d4r\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.899896 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-config-data\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.900102 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedac820-75c0-4fe8-865d-39225c3f8b09-logs\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.900182 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-combined-ca-bundle\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.900253 4789 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-scripts\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.900311 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9d4r\" (UniqueName: \"kubernetes.io/projected/aedac820-75c0-4fe8-865d-39225c3f8b09-kube-api-access-q9d4r\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.900564 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedac820-75c0-4fe8-865d-39225c3f8b09-logs\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.903510 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-scripts\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.903812 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-combined-ca-bundle\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.907559 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedac820-75c0-4fe8-865d-39225c3f8b09-config-data\") pod 
\"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:44 crc kubenswrapper[4789]: I1216 08:16:44.916276 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9d4r\" (UniqueName: \"kubernetes.io/projected/aedac820-75c0-4fe8-865d-39225c3f8b09-kube-api-access-q9d4r\") pod \"placement-86f8f549f6-wsg8b\" (UID: \"aedac820-75c0-4fe8-865d-39225c3f8b09\") " pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:45 crc kubenswrapper[4789]: I1216 08:16:45.103603 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:45 crc kubenswrapper[4789]: I1216 08:16:45.583844 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86f8f549f6-wsg8b"] Dec 16 08:16:45 crc kubenswrapper[4789]: I1216 08:16:45.689592 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86f8f549f6-wsg8b" event={"ID":"aedac820-75c0-4fe8-865d-39225c3f8b09","Type":"ContainerStarted","Data":"a736502904e4521e79b9fffcf67f6b051e790e0ac3d9c314cd68f1136e6a6757"} Dec 16 08:16:46 crc kubenswrapper[4789]: I1216 08:16:46.698788 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86f8f549f6-wsg8b" event={"ID":"aedac820-75c0-4fe8-865d-39225c3f8b09","Type":"ContainerStarted","Data":"93b2a554138a7dc208c4954c0b01f7146ae3cb01af2e80763325492cc36884c7"} Dec 16 08:16:46 crc kubenswrapper[4789]: I1216 08:16:46.698847 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86f8f549f6-wsg8b" event={"ID":"aedac820-75c0-4fe8-865d-39225c3f8b09","Type":"ContainerStarted","Data":"ca877cf6eb8636d77afb945ea602f6722b3388ca81c9c312e9d1bf9461f7b079"} Dec 16 08:16:46 crc kubenswrapper[4789]: I1216 08:16:46.698941 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:46 crc 
kubenswrapper[4789]: I1216 08:16:46.725774 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-86f8f549f6-wsg8b" podStartSLOduration=2.725748076 podStartE2EDuration="2.725748076s" podCreationTimestamp="2025-12-16 08:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:16:46.719089744 +0000 UTC m=+5144.980977373" watchObservedRunningTime="2025-12-16 08:16:46.725748076 +0000 UTC m=+5144.987635705" Dec 16 08:16:46 crc kubenswrapper[4789]: I1216 08:16:46.772130 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:16:46 crc kubenswrapper[4789]: I1216 08:16:46.876271 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869b597f9f-w6kfg"] Dec 16 08:16:46 crc kubenswrapper[4789]: I1216 08:16:46.876579 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" podUID="b32cd691-9873-4057-9f9a-15b65c718cba" containerName="dnsmasq-dns" containerID="cri-o://12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9" gracePeriod=10 Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.373289 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.451053 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-nb\") pod \"b32cd691-9873-4057-9f9a-15b65c718cba\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.451094 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdp42\" (UniqueName: \"kubernetes.io/projected/b32cd691-9873-4057-9f9a-15b65c718cba-kube-api-access-qdp42\") pod \"b32cd691-9873-4057-9f9a-15b65c718cba\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.451152 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-config\") pod \"b32cd691-9873-4057-9f9a-15b65c718cba\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.451208 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-dns-svc\") pod \"b32cd691-9873-4057-9f9a-15b65c718cba\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.451238 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-sb\") pod \"b32cd691-9873-4057-9f9a-15b65c718cba\" (UID: \"b32cd691-9873-4057-9f9a-15b65c718cba\") " Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.473192 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b32cd691-9873-4057-9f9a-15b65c718cba-kube-api-access-qdp42" (OuterVolumeSpecName: "kube-api-access-qdp42") pod "b32cd691-9873-4057-9f9a-15b65c718cba" (UID: "b32cd691-9873-4057-9f9a-15b65c718cba"). InnerVolumeSpecName "kube-api-access-qdp42". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.503760 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b32cd691-9873-4057-9f9a-15b65c718cba" (UID: "b32cd691-9873-4057-9f9a-15b65c718cba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.507906 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b32cd691-9873-4057-9f9a-15b65c718cba" (UID: "b32cd691-9873-4057-9f9a-15b65c718cba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.510526 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b32cd691-9873-4057-9f9a-15b65c718cba" (UID: "b32cd691-9873-4057-9f9a-15b65c718cba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.511446 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-config" (OuterVolumeSpecName: "config") pod "b32cd691-9873-4057-9f9a-15b65c718cba" (UID: "b32cd691-9873-4057-9f9a-15b65c718cba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.554655 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.554699 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.554714 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.554731 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32cd691-9873-4057-9f9a-15b65c718cba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.554743 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdp42\" (UniqueName: \"kubernetes.io/projected/b32cd691-9873-4057-9f9a-15b65c718cba-kube-api-access-qdp42\") on node \"crc\" DevicePath \"\"" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.731297 4789 generic.go:334] "Generic (PLEG): container finished" podID="b32cd691-9873-4057-9f9a-15b65c718cba" containerID="12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9" exitCode=0 Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.732949 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.733072 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" event={"ID":"b32cd691-9873-4057-9f9a-15b65c718cba","Type":"ContainerDied","Data":"12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9"} Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.733147 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b597f9f-w6kfg" event={"ID":"b32cd691-9873-4057-9f9a-15b65c718cba","Type":"ContainerDied","Data":"8f9af017b864efb29a9c6fe83596c5ea4260c94db1038ce60ca00063a712a27c"} Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.733178 4789 scope.go:117] "RemoveContainer" containerID="12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.733563 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.779704 4789 scope.go:117] "RemoveContainer" containerID="66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.795738 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869b597f9f-w6kfg"] Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.800373 4789 scope.go:117] "RemoveContainer" containerID="12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9" Dec 16 08:16:47 crc kubenswrapper[4789]: E1216 08:16:47.800861 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9\": container with ID starting with 12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9 not found: ID does not exist" 
containerID="12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.800904 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9"} err="failed to get container status \"12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9\": rpc error: code = NotFound desc = could not find container \"12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9\": container with ID starting with 12c9f9e1800ac2237841180a18fe18e6a1118d795a4f5c725e982e580386adf9 not found: ID does not exist" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.801358 4789 scope.go:117] "RemoveContainer" containerID="66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80" Dec 16 08:16:47 crc kubenswrapper[4789]: E1216 08:16:47.801764 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80\": container with ID starting with 66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80 not found: ID does not exist" containerID="66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.801798 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80"} err="failed to get container status \"66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80\": rpc error: code = NotFound desc = could not find container \"66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80\": container with ID starting with 66c3758103b470eb3e1c02d8958d9db7d235da0f0897870ce016f6c943db5d80 not found: ID does not exist" Dec 16 08:16:47 crc kubenswrapper[4789]: I1216 08:16:47.806990 4789 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869b597f9f-w6kfg"] Dec 16 08:16:48 crc kubenswrapper[4789]: I1216 08:16:48.117455 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32cd691-9873-4057-9f9a-15b65c718cba" path="/var/lib/kubelet/pods/b32cd691-9873-4057-9f9a-15b65c718cba/volumes" Dec 16 08:17:16 crc kubenswrapper[4789]: I1216 08:17:16.138078 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:17:16 crc kubenswrapper[4789]: I1216 08:17:16.139441 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-86f8f549f6-wsg8b" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.434996 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wh2l7"] Dec 16 08:17:39 crc kubenswrapper[4789]: E1216 08:17:39.435699 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32cd691-9873-4057-9f9a-15b65c718cba" containerName="dnsmasq-dns" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.435711 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32cd691-9873-4057-9f9a-15b65c718cba" containerName="dnsmasq-dns" Dec 16 08:17:39 crc kubenswrapper[4789]: E1216 08:17:39.435726 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32cd691-9873-4057-9f9a-15b65c718cba" containerName="init" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.435732 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32cd691-9873-4057-9f9a-15b65c718cba" containerName="init" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.435871 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32cd691-9873-4057-9f9a-15b65c718cba" containerName="dnsmasq-dns" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.436386 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.448699 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wh2l7"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.518238 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5p5l\" (UniqueName: \"kubernetes.io/projected/7d072154-82fc-4258-943f-1900efa7273c-kube-api-access-w5p5l\") pod \"nova-api-db-create-wh2l7\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.518485 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d072154-82fc-4258-943f-1900efa7273c-operator-scripts\") pod \"nova-api-db-create-wh2l7\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.524714 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9r2s9"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.526010 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.536025 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9r2s9"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.620382 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5p5l\" (UniqueName: \"kubernetes.io/projected/7d072154-82fc-4258-943f-1900efa7273c-kube-api-access-w5p5l\") pod \"nova-api-db-create-wh2l7\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.620473 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4ss\" (UniqueName: \"kubernetes.io/projected/83111890-5086-4173-a090-084f8d14334e-kube-api-access-gd4ss\") pod \"nova-cell0-db-create-9r2s9\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.620523 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d072154-82fc-4258-943f-1900efa7273c-operator-scripts\") pod \"nova-api-db-create-wh2l7\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.620586 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83111890-5086-4173-a090-084f8d14334e-operator-scripts\") pod \"nova-cell0-db-create-9r2s9\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.621444 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/7d072154-82fc-4258-943f-1900efa7273c-operator-scripts\") pod \"nova-api-db-create-wh2l7\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.631445 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-fxgvr"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.632642 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.642059 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d3bb-account-create-update-pmmb7"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.643148 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.646201 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.646577 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5p5l\" (UniqueName: \"kubernetes.io/projected/7d072154-82fc-4258-943f-1900efa7273c-kube-api-access-w5p5l\") pod \"nova-api-db-create-wh2l7\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.654718 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fxgvr"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.669557 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d3bb-account-create-update-pmmb7"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.727546 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4ss\" 
(UniqueName: \"kubernetes.io/projected/83111890-5086-4173-a090-084f8d14334e-kube-api-access-gd4ss\") pod \"nova-cell0-db-create-9r2s9\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.727684 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cn2t\" (UniqueName: \"kubernetes.io/projected/805906ee-0f3c-48a2-bbe5-c294c6299888-kube-api-access-8cn2t\") pod \"nova-api-d3bb-account-create-update-pmmb7\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.727727 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83111890-5086-4173-a090-084f8d14334e-operator-scripts\") pod \"nova-cell0-db-create-9r2s9\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.727744 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805906ee-0f3c-48a2-bbe5-c294c6299888-operator-scripts\") pod \"nova-api-d3bb-account-create-update-pmmb7\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.727896 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a474cd62-d32b-4059-bacc-f878b03ffbfb-operator-scripts\") pod \"nova-cell1-db-create-fxgvr\" (UID: \"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.728053 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn8mx\" (UniqueName: \"kubernetes.io/projected/a474cd62-d32b-4059-bacc-f878b03ffbfb-kube-api-access-qn8mx\") pod \"nova-cell1-db-create-fxgvr\" (UID: \"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.729224 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83111890-5086-4173-a090-084f8d14334e-operator-scripts\") pod \"nova-cell0-db-create-9r2s9\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.748437 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4ss\" (UniqueName: \"kubernetes.io/projected/83111890-5086-4173-a090-084f8d14334e-kube-api-access-gd4ss\") pod \"nova-cell0-db-create-9r2s9\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.753465 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.833417 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn8mx\" (UniqueName: \"kubernetes.io/projected/a474cd62-d32b-4059-bacc-f878b03ffbfb-kube-api-access-qn8mx\") pod \"nova-cell1-db-create-fxgvr\" (UID: \"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.833938 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cn2t\" (UniqueName: \"kubernetes.io/projected/805906ee-0f3c-48a2-bbe5-c294c6299888-kube-api-access-8cn2t\") pod \"nova-api-d3bb-account-create-update-pmmb7\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.833987 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805906ee-0f3c-48a2-bbe5-c294c6299888-operator-scripts\") pod \"nova-api-d3bb-account-create-update-pmmb7\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.834035 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a474cd62-d32b-4059-bacc-f878b03ffbfb-operator-scripts\") pod \"nova-cell1-db-create-fxgvr\" (UID: \"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.835006 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a474cd62-d32b-4059-bacc-f878b03ffbfb-operator-scripts\") pod \"nova-cell1-db-create-fxgvr\" (UID: 
\"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.836079 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805906ee-0f3c-48a2-bbe5-c294c6299888-operator-scripts\") pod \"nova-api-d3bb-account-create-update-pmmb7\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.841425 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.848374 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2aee-account-create-update-kzmtw"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.849925 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.854169 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.855372 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cn2t\" (UniqueName: \"kubernetes.io/projected/805906ee-0f3c-48a2-bbe5-c294c6299888-kube-api-access-8cn2t\") pod \"nova-api-d3bb-account-create-update-pmmb7\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.871274 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn8mx\" (UniqueName: \"kubernetes.io/projected/a474cd62-d32b-4059-bacc-f878b03ffbfb-kube-api-access-qn8mx\") pod \"nova-cell1-db-create-fxgvr\" (UID: \"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " 
pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.871519 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2aee-account-create-update-kzmtw"] Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.935239 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07ce2586-6062-48bc-a867-37d682d9b3b1-operator-scripts\") pod \"nova-cell0-2aee-account-create-update-kzmtw\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.935283 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp7g2\" (UniqueName: \"kubernetes.io/projected/07ce2586-6062-48bc-a867-37d682d9b3b1-kube-api-access-pp7g2\") pod \"nova-cell0-2aee-account-create-update-kzmtw\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:39 crc kubenswrapper[4789]: I1216 08:17:39.990293 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.002552 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.035451 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3c4f-account-create-update-kqkc6"] Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.036403 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp7g2\" (UniqueName: \"kubernetes.io/projected/07ce2586-6062-48bc-a867-37d682d9b3b1-kube-api-access-pp7g2\") pod \"nova-cell0-2aee-account-create-update-kzmtw\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.036563 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07ce2586-6062-48bc-a867-37d682d9b3b1-operator-scripts\") pod \"nova-cell0-2aee-account-create-update-kzmtw\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.037105 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.037309 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07ce2586-6062-48bc-a867-37d682d9b3b1-operator-scripts\") pod \"nova-cell0-2aee-account-create-update-kzmtw\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.040128 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.044660 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3c4f-account-create-update-kqkc6"] Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.071288 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp7g2\" (UniqueName: \"kubernetes.io/projected/07ce2586-6062-48bc-a867-37d682d9b3b1-kube-api-access-pp7g2\") pod \"nova-cell0-2aee-account-create-update-kzmtw\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.139595 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp54w\" (UniqueName: \"kubernetes.io/projected/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-kube-api-access-fp54w\") pod \"nova-cell1-3c4f-account-create-update-kqkc6\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.139656 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-operator-scripts\") pod \"nova-cell1-3c4f-account-create-update-kqkc6\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.241030 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.241777 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp54w\" (UniqueName: \"kubernetes.io/projected/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-kube-api-access-fp54w\") pod \"nova-cell1-3c4f-account-create-update-kqkc6\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.241870 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-operator-scripts\") pod \"nova-cell1-3c4f-account-create-update-kqkc6\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.242888 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-operator-scripts\") pod \"nova-cell1-3c4f-account-create-update-kqkc6\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.251118 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wh2l7"] Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.265353 4789 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-fp54w\" (UniqueName: \"kubernetes.io/projected/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-kube-api-access-fp54w\") pod \"nova-cell1-3c4f-account-create-update-kqkc6\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.356539 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.452958 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9r2s9"] Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.558283 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fxgvr"] Dec 16 08:17:40 crc kubenswrapper[4789]: W1216 08:17:40.572931 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda474cd62_d32b_4059_bacc_f878b03ffbfb.slice/crio-d9192da76ee217a059e10f989fe969ffc6199c4e93f8127c79eed6a61140a28b WatchSource:0}: Error finding container d9192da76ee217a059e10f989fe969ffc6199c4e93f8127c79eed6a61140a28b: Status 404 returned error can't find the container with id d9192da76ee217a059e10f989fe969ffc6199c4e93f8127c79eed6a61140a28b Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.603012 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d3bb-account-create-update-pmmb7"] Dec 16 08:17:40 crc kubenswrapper[4789]: W1216 08:17:40.630243 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805906ee_0f3c_48a2_bbe5_c294c6299888.slice/crio-c86478dcf63455997c0bfd1cf31134f931c14ac4c457cacc74daf68e318adc85 WatchSource:0}: Error finding container c86478dcf63455997c0bfd1cf31134f931c14ac4c457cacc74daf68e318adc85: Status 404 returned error can't find the 
container with id c86478dcf63455997c0bfd1cf31134f931c14ac4c457cacc74daf68e318adc85 Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.766428 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2aee-account-create-update-kzmtw"] Dec 16 08:17:40 crc kubenswrapper[4789]: I1216 08:17:40.926443 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3c4f-account-create-update-kqkc6"] Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.190363 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" event={"ID":"07ce2586-6062-48bc-a867-37d682d9b3b1","Type":"ContainerStarted","Data":"12087f0bc2306459f2239259278194efb105974ffa8f01d1ee4782a5e25efb29"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.190416 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" event={"ID":"07ce2586-6062-48bc-a867-37d682d9b3b1","Type":"ContainerStarted","Data":"6030152c843f1b2cd25aaade292f1a9d23a6d250285e05e60128dcd5d0355968"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.192151 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" event={"ID":"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0","Type":"ContainerStarted","Data":"2417fc65e2b0cf426aaceb5a0880c40bd7ffcbc8e94f73929c6af828f70f2b4e"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.192190 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" event={"ID":"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0","Type":"ContainerStarted","Data":"b82d0e004cff215889ed89c7fdcfdcd8184947a9957059020bc1a5034b1a996a"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.193687 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" 
event={"ID":"805906ee-0f3c-48a2-bbe5-c294c6299888","Type":"ContainerStarted","Data":"a0e523e6d7bdbac1a4c437a0b340c7e619fe2645c5eb32c8f21e649463b3fe55"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.193719 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" event={"ID":"805906ee-0f3c-48a2-bbe5-c294c6299888","Type":"ContainerStarted","Data":"c86478dcf63455997c0bfd1cf31134f931c14ac4c457cacc74daf68e318adc85"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.195308 4789 generic.go:334] "Generic (PLEG): container finished" podID="7d072154-82fc-4258-943f-1900efa7273c" containerID="7f0ac1988ec8a507c83102a5b9ed1eed69eeff1319a0336ab5303bfd8a69c8b4" exitCode=0 Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.195351 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wh2l7" event={"ID":"7d072154-82fc-4258-943f-1900efa7273c","Type":"ContainerDied","Data":"7f0ac1988ec8a507c83102a5b9ed1eed69eeff1319a0336ab5303bfd8a69c8b4"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.195385 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wh2l7" event={"ID":"7d072154-82fc-4258-943f-1900efa7273c","Type":"ContainerStarted","Data":"35a96d55706a753965d5436bb7c72f53743dfeebbb0aa10557000d6a7afde884"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.196821 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fxgvr" event={"ID":"a474cd62-d32b-4059-bacc-f878b03ffbfb","Type":"ContainerStarted","Data":"49c0000cc884030bd91263ba067e1c0efe9e9030220903e1b24a1bc129940433"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.196844 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fxgvr" event={"ID":"a474cd62-d32b-4059-bacc-f878b03ffbfb","Type":"ContainerStarted","Data":"d9192da76ee217a059e10f989fe969ffc6199c4e93f8127c79eed6a61140a28b"} Dec 16 08:17:41 
crc kubenswrapper[4789]: I1216 08:17:41.198291 4789 generic.go:334] "Generic (PLEG): container finished" podID="83111890-5086-4173-a090-084f8d14334e" containerID="62e3c8a1bf1ff34a19a392cfbd88506cbd96e8d50f3c63eebfacdb83356de130" exitCode=0 Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.198326 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9r2s9" event={"ID":"83111890-5086-4173-a090-084f8d14334e","Type":"ContainerDied","Data":"62e3c8a1bf1ff34a19a392cfbd88506cbd96e8d50f3c63eebfacdb83356de130"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.198356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9r2s9" event={"ID":"83111890-5086-4173-a090-084f8d14334e","Type":"ContainerStarted","Data":"8f82cd25ee79f5fd5e28f9c2f75a1518bef4e2b6fb8484e08e2faed7089b6521"} Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.211656 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" podStartSLOduration=2.211640603 podStartE2EDuration="2.211640603s" podCreationTimestamp="2025-12-16 08:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:17:41.208473266 +0000 UTC m=+5199.470360895" watchObservedRunningTime="2025-12-16 08:17:41.211640603 +0000 UTC m=+5199.473528232" Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.251322 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" podStartSLOduration=1.251304863 podStartE2EDuration="1.251304863s" podCreationTimestamp="2025-12-16 08:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:17:41.2450367 +0000 UTC m=+5199.506924329" watchObservedRunningTime="2025-12-16 
08:17:41.251304863 +0000 UTC m=+5199.513192492" Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.267681 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" podStartSLOduration=2.267663543 podStartE2EDuration="2.267663543s" podCreationTimestamp="2025-12-16 08:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:17:41.262843915 +0000 UTC m=+5199.524731554" watchObservedRunningTime="2025-12-16 08:17:41.267663543 +0000 UTC m=+5199.529551172" Dec 16 08:17:41 crc kubenswrapper[4789]: I1216 08:17:41.283615 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-fxgvr" podStartSLOduration=2.283593462 podStartE2EDuration="2.283593462s" podCreationTimestamp="2025-12-16 08:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:17:41.275133575 +0000 UTC m=+5199.537021204" watchObservedRunningTime="2025-12-16 08:17:41.283593462 +0000 UTC m=+5199.545481091" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.206483 4789 generic.go:334] "Generic (PLEG): container finished" podID="07ce2586-6062-48bc-a867-37d682d9b3b1" containerID="12087f0bc2306459f2239259278194efb105974ffa8f01d1ee4782a5e25efb29" exitCode=0 Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.207066 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" event={"ID":"07ce2586-6062-48bc-a867-37d682d9b3b1","Type":"ContainerDied","Data":"12087f0bc2306459f2239259278194efb105974ffa8f01d1ee4782a5e25efb29"} Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.209898 4789 generic.go:334] "Generic (PLEG): container finished" podID="2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0" 
containerID="2417fc65e2b0cf426aaceb5a0880c40bd7ffcbc8e94f73929c6af828f70f2b4e" exitCode=0 Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.210069 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" event={"ID":"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0","Type":"ContainerDied","Data":"2417fc65e2b0cf426aaceb5a0880c40bd7ffcbc8e94f73929c6af828f70f2b4e"} Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.212165 4789 generic.go:334] "Generic (PLEG): container finished" podID="805906ee-0f3c-48a2-bbe5-c294c6299888" containerID="a0e523e6d7bdbac1a4c437a0b340c7e619fe2645c5eb32c8f21e649463b3fe55" exitCode=0 Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.212257 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" event={"ID":"805906ee-0f3c-48a2-bbe5-c294c6299888","Type":"ContainerDied","Data":"a0e523e6d7bdbac1a4c437a0b340c7e619fe2645c5eb32c8f21e649463b3fe55"} Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.213621 4789 generic.go:334] "Generic (PLEG): container finished" podID="a474cd62-d32b-4059-bacc-f878b03ffbfb" containerID="49c0000cc884030bd91263ba067e1c0efe9e9030220903e1b24a1bc129940433" exitCode=0 Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.213817 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fxgvr" event={"ID":"a474cd62-d32b-4059-bacc-f878b03ffbfb","Type":"ContainerDied","Data":"49c0000cc884030bd91263ba067e1c0efe9e9030220903e1b24a1bc129940433"} Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.589881 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.608867 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.714584 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5p5l\" (UniqueName: \"kubernetes.io/projected/7d072154-82fc-4258-943f-1900efa7273c-kube-api-access-w5p5l\") pod \"7d072154-82fc-4258-943f-1900efa7273c\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.715094 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83111890-5086-4173-a090-084f8d14334e-operator-scripts\") pod \"83111890-5086-4173-a090-084f8d14334e\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.715219 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4ss\" (UniqueName: \"kubernetes.io/projected/83111890-5086-4173-a090-084f8d14334e-kube-api-access-gd4ss\") pod \"83111890-5086-4173-a090-084f8d14334e\" (UID: \"83111890-5086-4173-a090-084f8d14334e\") " Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.715333 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d072154-82fc-4258-943f-1900efa7273c-operator-scripts\") pod \"7d072154-82fc-4258-943f-1900efa7273c\" (UID: \"7d072154-82fc-4258-943f-1900efa7273c\") " Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.715621 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83111890-5086-4173-a090-084f8d14334e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83111890-5086-4173-a090-084f8d14334e" (UID: "83111890-5086-4173-a090-084f8d14334e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.715821 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83111890-5086-4173-a090-084f8d14334e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.716146 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d072154-82fc-4258-943f-1900efa7273c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d072154-82fc-4258-943f-1900efa7273c" (UID: "7d072154-82fc-4258-943f-1900efa7273c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.721319 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d072154-82fc-4258-943f-1900efa7273c-kube-api-access-w5p5l" (OuterVolumeSpecName: "kube-api-access-w5p5l") pod "7d072154-82fc-4258-943f-1900efa7273c" (UID: "7d072154-82fc-4258-943f-1900efa7273c"). InnerVolumeSpecName "kube-api-access-w5p5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.721568 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83111890-5086-4173-a090-084f8d14334e-kube-api-access-gd4ss" (OuterVolumeSpecName: "kube-api-access-gd4ss") pod "83111890-5086-4173-a090-084f8d14334e" (UID: "83111890-5086-4173-a090-084f8d14334e"). InnerVolumeSpecName "kube-api-access-gd4ss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.817269 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5p5l\" (UniqueName: \"kubernetes.io/projected/7d072154-82fc-4258-943f-1900efa7273c-kube-api-access-w5p5l\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.817697 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd4ss\" (UniqueName: \"kubernetes.io/projected/83111890-5086-4173-a090-084f8d14334e-kube-api-access-gd4ss\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:42 crc kubenswrapper[4789]: I1216 08:17:42.817712 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d072154-82fc-4258-943f-1900efa7273c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.231215 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9r2s9" event={"ID":"83111890-5086-4173-a090-084f8d14334e","Type":"ContainerDied","Data":"8f82cd25ee79f5fd5e28f9c2f75a1518bef4e2b6fb8484e08e2faed7089b6521"} Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.231997 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f82cd25ee79f5fd5e28f9c2f75a1518bef4e2b6fb8484e08e2faed7089b6521" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.231282 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9r2s9" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.236553 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wh2l7" event={"ID":"7d072154-82fc-4258-943f-1900efa7273c","Type":"ContainerDied","Data":"35a96d55706a753965d5436bb7c72f53743dfeebbb0aa10557000d6a7afde884"} Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.236607 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a96d55706a753965d5436bb7c72f53743dfeebbb0aa10557000d6a7afde884" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.236570 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wh2l7" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.594678 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.686979 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.692767 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.708824 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.737994 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-operator-scripts\") pod \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.738209 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp54w\" (UniqueName: \"kubernetes.io/projected/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-kube-api-access-fp54w\") pod \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\" (UID: \"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.739524 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0" (UID: "2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.749869 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-kube-api-access-fp54w" (OuterVolumeSpecName: "kube-api-access-fp54w") pod "2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0" (UID: "2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0"). InnerVolumeSpecName "kube-api-access-fp54w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.840331 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn8mx\" (UniqueName: \"kubernetes.io/projected/a474cd62-d32b-4059-bacc-f878b03ffbfb-kube-api-access-qn8mx\") pod \"a474cd62-d32b-4059-bacc-f878b03ffbfb\" (UID: \"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.840598 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a474cd62-d32b-4059-bacc-f878b03ffbfb-operator-scripts\") pod \"a474cd62-d32b-4059-bacc-f878b03ffbfb\" (UID: \"a474cd62-d32b-4059-bacc-f878b03ffbfb\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.840707 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cn2t\" (UniqueName: \"kubernetes.io/projected/805906ee-0f3c-48a2-bbe5-c294c6299888-kube-api-access-8cn2t\") pod \"805906ee-0f3c-48a2-bbe5-c294c6299888\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.840870 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07ce2586-6062-48bc-a867-37d682d9b3b1-operator-scripts\") pod \"07ce2586-6062-48bc-a867-37d682d9b3b1\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841062 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a474cd62-d32b-4059-bacc-f878b03ffbfb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a474cd62-d32b-4059-bacc-f878b03ffbfb" (UID: "a474cd62-d32b-4059-bacc-f878b03ffbfb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841172 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp7g2\" (UniqueName: \"kubernetes.io/projected/07ce2586-6062-48bc-a867-37d682d9b3b1-kube-api-access-pp7g2\") pod \"07ce2586-6062-48bc-a867-37d682d9b3b1\" (UID: \"07ce2586-6062-48bc-a867-37d682d9b3b1\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841282 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805906ee-0f3c-48a2-bbe5-c294c6299888-operator-scripts\") pod \"805906ee-0f3c-48a2-bbe5-c294c6299888\" (UID: \"805906ee-0f3c-48a2-bbe5-c294c6299888\") " Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841589 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841671 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a474cd62-d32b-4059-bacc-f878b03ffbfb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841741 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp54w\" (UniqueName: \"kubernetes.io/projected/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0-kube-api-access-fp54w\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841682 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805906ee-0f3c-48a2-bbe5-c294c6299888-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "805906ee-0f3c-48a2-bbe5-c294c6299888" (UID: "805906ee-0f3c-48a2-bbe5-c294c6299888"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.841967 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07ce2586-6062-48bc-a867-37d682d9b3b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07ce2586-6062-48bc-a867-37d682d9b3b1" (UID: "07ce2586-6062-48bc-a867-37d682d9b3b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.843090 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a474cd62-d32b-4059-bacc-f878b03ffbfb-kube-api-access-qn8mx" (OuterVolumeSpecName: "kube-api-access-qn8mx") pod "a474cd62-d32b-4059-bacc-f878b03ffbfb" (UID: "a474cd62-d32b-4059-bacc-f878b03ffbfb"). InnerVolumeSpecName "kube-api-access-qn8mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.843376 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ce2586-6062-48bc-a867-37d682d9b3b1-kube-api-access-pp7g2" (OuterVolumeSpecName: "kube-api-access-pp7g2") pod "07ce2586-6062-48bc-a867-37d682d9b3b1" (UID: "07ce2586-6062-48bc-a867-37d682d9b3b1"). InnerVolumeSpecName "kube-api-access-pp7g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.843528 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805906ee-0f3c-48a2-bbe5-c294c6299888-kube-api-access-8cn2t" (OuterVolumeSpecName: "kube-api-access-8cn2t") pod "805906ee-0f3c-48a2-bbe5-c294c6299888" (UID: "805906ee-0f3c-48a2-bbe5-c294c6299888"). InnerVolumeSpecName "kube-api-access-8cn2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.942508 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07ce2586-6062-48bc-a867-37d682d9b3b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.942543 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp7g2\" (UniqueName: \"kubernetes.io/projected/07ce2586-6062-48bc-a867-37d682d9b3b1-kube-api-access-pp7g2\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.942554 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/805906ee-0f3c-48a2-bbe5-c294c6299888-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.942564 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn8mx\" (UniqueName: \"kubernetes.io/projected/a474cd62-d32b-4059-bacc-f878b03ffbfb-kube-api-access-qn8mx\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:43 crc kubenswrapper[4789]: I1216 08:17:43.942573 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cn2t\" (UniqueName: \"kubernetes.io/projected/805906ee-0f3c-48a2-bbe5-c294c6299888-kube-api-access-8cn2t\") on node \"crc\" DevicePath \"\"" Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.257404 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" event={"ID":"2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0","Type":"ContainerDied","Data":"b82d0e004cff215889ed89c7fdcfdcd8184947a9957059020bc1a5034b1a996a"} Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.257455 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82d0e004cff215889ed89c7fdcfdcd8184947a9957059020bc1a5034b1a996a" Dec 16 
08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.257528 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3c4f-account-create-update-kqkc6" Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.260451 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.260485 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d3bb-account-create-update-pmmb7" event={"ID":"805906ee-0f3c-48a2-bbe5-c294c6299888","Type":"ContainerDied","Data":"c86478dcf63455997c0bfd1cf31134f931c14ac4c457cacc74daf68e318adc85"} Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.260546 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86478dcf63455997c0bfd1cf31134f931c14ac4c457cacc74daf68e318adc85" Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.262419 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fxgvr" event={"ID":"a474cd62-d32b-4059-bacc-f878b03ffbfb","Type":"ContainerDied","Data":"d9192da76ee217a059e10f989fe969ffc6199c4e93f8127c79eed6a61140a28b"} Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.262492 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9192da76ee217a059e10f989fe969ffc6199c4e93f8127c79eed6a61140a28b" Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.262590 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fxgvr" Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.264702 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" event={"ID":"07ce2586-6062-48bc-a867-37d682d9b3b1","Type":"ContainerDied","Data":"6030152c843f1b2cd25aaade292f1a9d23a6d250285e05e60128dcd5d0355968"} Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.264726 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6030152c843f1b2cd25aaade292f1a9d23a6d250285e05e60128dcd5d0355968" Dec 16 08:17:44 crc kubenswrapper[4789]: I1216 08:17:44.264808 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2aee-account-create-update-kzmtw" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.119812 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pskhf"] Dec 16 08:17:45 crc kubenswrapper[4789]: E1216 08:17:45.120411 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83111890-5086-4173-a090-084f8d14334e" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120428 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="83111890-5086-4173-a090-084f8d14334e" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: E1216 08:17:45.120443 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120449 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: E1216 08:17:45.120461 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ce2586-6062-48bc-a867-37d682d9b3b1" 
containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120467 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ce2586-6062-48bc-a867-37d682d9b3b1" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: E1216 08:17:45.120479 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a474cd62-d32b-4059-bacc-f878b03ffbfb" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120486 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a474cd62-d32b-4059-bacc-f878b03ffbfb" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: E1216 08:17:45.120507 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d072154-82fc-4258-943f-1900efa7273c" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120515 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d072154-82fc-4258-943f-1900efa7273c" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: E1216 08:17:45.120528 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805906ee-0f3c-48a2-bbe5-c294c6299888" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120535 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="805906ee-0f3c-48a2-bbe5-c294c6299888" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120694 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="83111890-5086-4173-a090-084f8d14334e" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120712 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="805906ee-0f3c-48a2-bbe5-c294c6299888" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120724 4789 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a474cd62-d32b-4059-bacc-f878b03ffbfb" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120731 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ce2586-6062-48bc-a867-37d682d9b3b1" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120738 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0" containerName="mariadb-account-create-update" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.120750 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d072154-82fc-4258-943f-1900efa7273c" containerName="mariadb-database-create" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.121319 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.124413 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.124584 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dvrkp" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.124779 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.136282 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pskhf"] Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.164742 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: 
\"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.164898 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-scripts\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.164996 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-config-data\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.165058 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rr2r\" (UniqueName: \"kubernetes.io/projected/fd40f148-e2fa-49e9-8ab9-dec31881d548-kube-api-access-6rr2r\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.266089 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.266161 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-scripts\") pod 
\"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.266208 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-config-data\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.266227 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rr2r\" (UniqueName: \"kubernetes.io/projected/fd40f148-e2fa-49e9-8ab9-dec31881d548-kube-api-access-6rr2r\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.270016 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-scripts\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.270065 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.270232 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-config-data\") pod 
\"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.282795 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rr2r\" (UniqueName: \"kubernetes.io/projected/fd40f148-e2fa-49e9-8ab9-dec31881d548-kube-api-access-6rr2r\") pod \"nova-cell0-conductor-db-sync-pskhf\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.440786 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:17:45 crc kubenswrapper[4789]: I1216 08:17:45.902995 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pskhf"] Dec 16 08:17:46 crc kubenswrapper[4789]: I1216 08:17:46.287407 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pskhf" event={"ID":"fd40f148-e2fa-49e9-8ab9-dec31881d548","Type":"ContainerStarted","Data":"6bccac71fdaee56e7bc990d3df0f427572887bef37b408548e28398863162b1d"} Dec 16 08:18:00 crc kubenswrapper[4789]: E1216 08:18:00.743048 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:18:00 crc kubenswrapper[4789]: E1216 08:18:00.743713 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:18:00 crc kubenswrapper[4789]: E1216 08:18:00.743859 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:nova-cell0-conductor-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rr2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-pskhf_openstack(fd40f148-e2fa-49e9-8ab9-dec31881d548): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:18:00 crc kubenswrapper[4789]: E1216 08:18:00.745066 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-pskhf" podUID="fd40f148-e2fa-49e9-8ab9-dec31881d548" Dec 16 08:18:01 crc kubenswrapper[4789]: E1216 08:18:01.418194 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-pskhf" podUID="fd40f148-e2fa-49e9-8ab9-dec31881d548" Dec 16 08:18:12 crc kubenswrapper[4789]: I1216 08:18:12.503970 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pskhf" event={"ID":"fd40f148-e2fa-49e9-8ab9-dec31881d548","Type":"ContainerStarted","Data":"73b8eef75e07b02bf777e0d0ecaaf6909a58423a954892f9c2a775ae8bcd1e38"} Dec 16 08:18:12 crc kubenswrapper[4789]: I1216 08:18:12.523707 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pskhf" podStartSLOduration=1.147076992 podStartE2EDuration="27.523685589s" podCreationTimestamp="2025-12-16 08:17:45 +0000 UTC" firstStartedPulling="2025-12-16 08:17:45.903098688 +0000 UTC m=+5204.164986317" lastFinishedPulling="2025-12-16 08:18:12.279707285 +0000 UTC m=+5230.541594914" observedRunningTime="2025-12-16 08:18:12.51838544 +0000 UTC m=+5230.780273069" watchObservedRunningTime="2025-12-16 
08:18:12.523685589 +0000 UTC m=+5230.785573208" Dec 16 08:18:17 crc kubenswrapper[4789]: I1216 08:18:17.546603 4789 generic.go:334] "Generic (PLEG): container finished" podID="fd40f148-e2fa-49e9-8ab9-dec31881d548" containerID="73b8eef75e07b02bf777e0d0ecaaf6909a58423a954892f9c2a775ae8bcd1e38" exitCode=0 Dec 16 08:18:17 crc kubenswrapper[4789]: I1216 08:18:17.546697 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pskhf" event={"ID":"fd40f148-e2fa-49e9-8ab9-dec31881d548","Type":"ContainerDied","Data":"73b8eef75e07b02bf777e0d0ecaaf6909a58423a954892f9c2a775ae8bcd1e38"} Dec 16 08:18:18 crc kubenswrapper[4789]: I1216 08:18:18.916172 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:18:18 crc kubenswrapper[4789]: I1216 08:18:18.980560 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-scripts\") pod \"fd40f148-e2fa-49e9-8ab9-dec31881d548\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " Dec 16 08:18:18 crc kubenswrapper[4789]: I1216 08:18:18.980689 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-config-data\") pod \"fd40f148-e2fa-49e9-8ab9-dec31881d548\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " Dec 16 08:18:18 crc kubenswrapper[4789]: I1216 08:18:18.980779 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rr2r\" (UniqueName: \"kubernetes.io/projected/fd40f148-e2fa-49e9-8ab9-dec31881d548-kube-api-access-6rr2r\") pod \"fd40f148-e2fa-49e9-8ab9-dec31881d548\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " Dec 16 08:18:18 crc kubenswrapper[4789]: I1216 08:18:18.981001 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-combined-ca-bundle\") pod \"fd40f148-e2fa-49e9-8ab9-dec31881d548\" (UID: \"fd40f148-e2fa-49e9-8ab9-dec31881d548\") " Dec 16 08:18:18 crc kubenswrapper[4789]: I1216 08:18:18.987840 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-scripts" (OuterVolumeSpecName: "scripts") pod "fd40f148-e2fa-49e9-8ab9-dec31881d548" (UID: "fd40f148-e2fa-49e9-8ab9-dec31881d548"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:18 crc kubenswrapper[4789]: I1216 08:18:18.987953 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd40f148-e2fa-49e9-8ab9-dec31881d548-kube-api-access-6rr2r" (OuterVolumeSpecName: "kube-api-access-6rr2r") pod "fd40f148-e2fa-49e9-8ab9-dec31881d548" (UID: "fd40f148-e2fa-49e9-8ab9-dec31881d548"). InnerVolumeSpecName "kube-api-access-6rr2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.017856 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-config-data" (OuterVolumeSpecName: "config-data") pod "fd40f148-e2fa-49e9-8ab9-dec31881d548" (UID: "fd40f148-e2fa-49e9-8ab9-dec31881d548"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.019160 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd40f148-e2fa-49e9-8ab9-dec31881d548" (UID: "fd40f148-e2fa-49e9-8ab9-dec31881d548"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.083281 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.083328 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.083339 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd40f148-e2fa-49e9-8ab9-dec31881d548-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.083348 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rr2r\" (UniqueName: \"kubernetes.io/projected/fd40f148-e2fa-49e9-8ab9-dec31881d548-kube-api-access-6rr2r\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.581118 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pskhf" event={"ID":"fd40f148-e2fa-49e9-8ab9-dec31881d548","Type":"ContainerDied","Data":"6bccac71fdaee56e7bc990d3df0f427572887bef37b408548e28398863162b1d"} Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.581179 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pskhf" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.581193 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bccac71fdaee56e7bc990d3df0f427572887bef37b408548e28398863162b1d" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.650540 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:18:19 crc kubenswrapper[4789]: E1216 08:18:19.650974 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd40f148-e2fa-49e9-8ab9-dec31881d548" containerName="nova-cell0-conductor-db-sync" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.650992 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd40f148-e2fa-49e9-8ab9-dec31881d548" containerName="nova-cell0-conductor-db-sync" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.651230 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd40f148-e2fa-49e9-8ab9-dec31881d548" containerName="nova-cell0-conductor-db-sync" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.651877 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.655433 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dvrkp" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.655637 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.662954 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.693632 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.693681 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.693710 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4zm7\" (UniqueName: \"kubernetes.io/projected/ec472b51-08c1-499e-8b85-e103741b35d8-kube-api-access-l4zm7\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.795298 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4zm7\" (UniqueName: 
\"kubernetes.io/projected/ec472b51-08c1-499e-8b85-e103741b35d8-kube-api-access-l4zm7\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.795496 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.795543 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.800938 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.801178 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.812155 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4zm7\" (UniqueName: \"kubernetes.io/projected/ec472b51-08c1-499e-8b85-e103741b35d8-kube-api-access-l4zm7\") pod \"nova-cell0-conductor-0\" (UID: 
\"ec472b51-08c1-499e-8b85-e103741b35d8\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:19 crc kubenswrapper[4789]: I1216 08:18:19.980291 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:20 crc kubenswrapper[4789]: I1216 08:18:20.422419 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:18:20 crc kubenswrapper[4789]: I1216 08:18:20.597388 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ec472b51-08c1-499e-8b85-e103741b35d8","Type":"ContainerStarted","Data":"45b55c2c3a485926d1efa79d1a89907c26ad2c0dcf8849e883300145d34d947a"} Dec 16 08:18:21 crc kubenswrapper[4789]: I1216 08:18:21.608278 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ec472b51-08c1-499e-8b85-e103741b35d8","Type":"ContainerStarted","Data":"dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22"} Dec 16 08:18:21 crc kubenswrapper[4789]: I1216 08:18:21.609594 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:21 crc kubenswrapper[4789]: I1216 08:18:21.641098 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.641079358 podStartE2EDuration="2.641079358s" podCreationTimestamp="2025-12-16 08:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:21.624291837 +0000 UTC m=+5239.886179486" watchObservedRunningTime="2025-12-16 08:18:21.641079358 +0000 UTC m=+5239.902966987" Dec 16 08:18:21 crc kubenswrapper[4789]: I1216 08:18:21.928388 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:18:21 crc kubenswrapper[4789]: I1216 08:18:21.928462 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.002283 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.047115 4789 scope.go:117] "RemoveContainer" containerID="8061226e9f7920b0df6606b4cfb42c79c665529ac1476c77a0b98d6e652476e6" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.413114 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xm7gp"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.414344 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.416409 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.416550 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.427879 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm7gp"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.515457 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-467jb\" (UniqueName: \"kubernetes.io/projected/be1b68ad-31cf-492b-a3c6-eae046daf5e0-kube-api-access-467jb\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.515819 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-config-data\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.515970 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.515994 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-scripts\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.538363 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.539730 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.542220 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.554864 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.571333 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.572534 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.577347 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.604204 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.619430 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-467jb\" (UniqueName: \"kubernetes.io/projected/be1b68ad-31cf-492b-a3c6-eae046daf5e0-kube-api-access-467jb\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.619515 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-config-data\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.619561 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9ldr\" (UniqueName: \"kubernetes.io/projected/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-kube-api-access-m9ldr\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.619600 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-config-data\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 
08:18:30.619628 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.619675 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.619697 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-scripts\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.619744 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-logs\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.639041 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-scripts\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.639685 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-config-data\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.644485 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.649251 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-467jb\" (UniqueName: \"kubernetes.io/projected/be1b68ad-31cf-492b-a3c6-eae046daf5e0-kube-api-access-467jb\") pod \"nova-cell0-cell-mapping-xm7gp\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.671011 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.672238 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.674215 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.683293 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.688508 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.699294 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.704041 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.721592 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9ldr\" (UniqueName: \"kubernetes.io/projected/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-kube-api-access-m9ldr\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.721850 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-config-data\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.721957 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.722063 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.722151 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-logs\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " 
pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.722238 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.722316 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjp9q\" (UniqueName: \"kubernetes.io/projected/7864394a-9bf8-40d1-b8a8-8b5b989516bb-kube-api-access-fjp9q\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.728526 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-config-data\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.728807 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-logs\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.732117 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.732747 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " 
pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.762792 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.769289 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9ldr\" (UniqueName: \"kubernetes.io/projected/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-kube-api-access-m9ldr\") pod \"nova-api-0\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.823002 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7db745fdc9-fvwzc"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.824576 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.826974 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d189db-d38f-485b-92a4-eee68fab0902-logs\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827022 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827049 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmxpx\" (UniqueName: \"kubernetes.io/projected/39df3a48-1eb0-4ffd-b69e-c047588efd4c-kube-api-access-zmxpx\") pod \"nova-scheduler-0\" (UID: 
\"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827100 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827132 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-config-data\") pod \"nova-scheduler-0\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827162 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827185 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjp9q\" (UniqueName: \"kubernetes.io/projected/7864394a-9bf8-40d1-b8a8-8b5b989516bb-kube-api-access-fjp9q\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827204 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 
08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827219 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvvh\" (UniqueName: \"kubernetes.io/projected/44d189db-d38f-485b-92a4-eee68fab0902-kube-api-access-9mvvh\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.827254 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-config-data\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.835313 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7db745fdc9-fvwzc"] Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.836540 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.843240 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.851616 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjp9q\" (UniqueName: \"kubernetes.io/projected/7864394a-9bf8-40d1-b8a8-8b5b989516bb-kube-api-access-fjp9q\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.857319 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.906287 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.929492 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d189db-d38f-485b-92a4-eee68fab0902-logs\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.929869 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.929903 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmxpx\" (UniqueName: \"kubernetes.io/projected/39df3a48-1eb0-4ffd-b69e-c047588efd4c-kube-api-access-zmxpx\") pod \"nova-scheduler-0\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.929957 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-nb\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930016 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-config-data\") pod \"nova-scheduler-0\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930011 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d189db-d38f-485b-92a4-eee68fab0902-logs\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930116 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-sb\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930182 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-dns-svc\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930240 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930263 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvvh\" (UniqueName: 
\"kubernetes.io/projected/44d189db-d38f-485b-92a4-eee68fab0902-kube-api-access-9mvvh\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930378 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-config\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930504 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-config-data\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.930535 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hl7h\" (UniqueName: \"kubernetes.io/projected/00ebde91-a24c-4979-a38c-b69318f1a615-kube-api-access-5hl7h\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.938640 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.939887 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-config-data\") pod \"nova-scheduler-0\" (UID: 
\"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.941589 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-config-data\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.942087 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.944854 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmxpx\" (UniqueName: \"kubernetes.io/projected/39df3a48-1eb0-4ffd-b69e-c047588efd4c-kube-api-access-zmxpx\") pod \"nova-scheduler-0\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:30 crc kubenswrapper[4789]: I1216 08:18:30.950484 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvvh\" (UniqueName: \"kubernetes.io/projected/44d189db-d38f-485b-92a4-eee68fab0902-kube-api-access-9mvvh\") pod \"nova-metadata-0\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " pod="openstack/nova-metadata-0" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.034843 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-sb\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.034887 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-dns-svc\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.034974 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-config\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.035013 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hl7h\" (UniqueName: \"kubernetes.io/projected/00ebde91-a24c-4979-a38c-b69318f1a615-kube-api-access-5hl7h\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.035094 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-nb\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.035957 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-nb\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.036451 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-sb\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.036979 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-dns-svc\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.037487 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-config\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.065019 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hl7h\" (UniqueName: \"kubernetes.io/projected/00ebde91-a24c-4979-a38c-b69318f1a615-kube-api-access-5hl7h\") pod \"dnsmasq-dns-7db745fdc9-fvwzc\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.121278 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.128261 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.141370 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.350161 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm7gp"] Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.421214 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:31 crc kubenswrapper[4789]: W1216 08:18:31.424006 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a05a588_5b7a_4f3c_99e8_4e84e1110dfb.slice/crio-bb8c48b7e118bb0a115b80f1c6ca8f039663449ad7361567497be154a6911519 WatchSource:0}: Error finding container bb8c48b7e118bb0a115b80f1c6ca8f039663449ad7361567497be154a6911519: Status 404 returned error can't find the container with id bb8c48b7e118bb0a115b80f1c6ca8f039663449ad7361567497be154a6911519 Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.490410 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k2fd8"] Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.492433 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.497655 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.497867 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.523986 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k2fd8"] Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.549891 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:18:31 crc kubenswrapper[4789]: W1216 08:18:31.556221 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7864394a_9bf8_40d1_b8a8_8b5b989516bb.slice/crio-4cfc6452302d4d167c76cf40ad3bee41b36084805c1d77d7b712927c5f64decb WatchSource:0}: Error finding container 4cfc6452302d4d167c76cf40ad3bee41b36084805c1d77d7b712927c5f64decb: Status 404 returned error can't find the container with id 4cfc6452302d4d167c76cf40ad3bee41b36084805c1d77d7b712927c5f64decb Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.644627 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-config-data\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.644674 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k9fp\" (UniqueName: \"kubernetes.io/projected/452f63cc-39ba-453e-89dc-c8537fd2ff30-kube-api-access-5k9fp\") 
pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.645047 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-scripts\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.645119 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.667151 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:31 crc kubenswrapper[4789]: W1216 08:18:31.669044 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44d189db_d38f_485b_92a4_eee68fab0902.slice/crio-336d6013c9df11539d36fc8ab6dbc0bd5af3ddd6fd70853c64374f169eba70ab WatchSource:0}: Error finding container 336d6013c9df11539d36fc8ab6dbc0bd5af3ddd6fd70853c64374f169eba70ab: Status 404 returned error can't find the container with id 336d6013c9df11539d36fc8ab6dbc0bd5af3ddd6fd70853c64374f169eba70ab Dec 16 08:18:31 crc kubenswrapper[4789]: W1216 08:18:31.672572 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39df3a48_1eb0_4ffd_b69e_c047588efd4c.slice/crio-f4335f25eb6b275f126908394d8bee0e6e011b5a20a32fca03081b21e0c1a4d5 WatchSource:0}: Error 
finding container f4335f25eb6b275f126908394d8bee0e6e011b5a20a32fca03081b21e0c1a4d5: Status 404 returned error can't find the container with id f4335f25eb6b275f126908394d8bee0e6e011b5a20a32fca03081b21e0c1a4d5 Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.674582 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.725268 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"39df3a48-1eb0-4ffd-b69e-c047588efd4c","Type":"ContainerStarted","Data":"f4335f25eb6b275f126908394d8bee0e6e011b5a20a32fca03081b21e0c1a4d5"} Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.726309 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm7gp" event={"ID":"be1b68ad-31cf-492b-a3c6-eae046daf5e0","Type":"ContainerStarted","Data":"159a9e7f43814a754524a3fb9e4f95c4183b1243de95a405819500c40be45b9b"} Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.727016 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7864394a-9bf8-40d1-b8a8-8b5b989516bb","Type":"ContainerStarted","Data":"4cfc6452302d4d167c76cf40ad3bee41b36084805c1d77d7b712927c5f64decb"} Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.733441 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44d189db-d38f-485b-92a4-eee68fab0902","Type":"ContainerStarted","Data":"336d6013c9df11539d36fc8ab6dbc0bd5af3ddd6fd70853c64374f169eba70ab"} Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.734813 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb","Type":"ContainerStarted","Data":"bb8c48b7e118bb0a115b80f1c6ca8f039663449ad7361567497be154a6911519"} Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.746378 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-scripts\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.746472 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.746608 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-config-data\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.746646 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k9fp\" (UniqueName: \"kubernetes.io/projected/452f63cc-39ba-453e-89dc-c8537fd2ff30-kube-api-access-5k9fp\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.753956 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.754003 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-config-data\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.755343 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-scripts\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.768567 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k9fp\" (UniqueName: \"kubernetes.io/projected/452f63cc-39ba-453e-89dc-c8537fd2ff30-kube-api-access-5k9fp\") pod \"nova-cell1-conductor-db-sync-k2fd8\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:31 crc kubenswrapper[4789]: W1216 08:18:31.844607 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00ebde91_a24c_4979_a38c_b69318f1a615.slice/crio-4e905d8f2e0e09272c1667add69458a6bd100021f438503b719b88afae0af989 WatchSource:0}: Error finding container 4e905d8f2e0e09272c1667add69458a6bd100021f438503b719b88afae0af989: Status 404 returned error can't find the container with id 4e905d8f2e0e09272c1667add69458a6bd100021f438503b719b88afae0af989 Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.848903 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7db745fdc9-fvwzc"] Dec 16 08:18:31 crc kubenswrapper[4789]: I1216 08:18:31.872226 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.297552 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k2fd8"] Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.745149 4789 generic.go:334] "Generic (PLEG): container finished" podID="00ebde91-a24c-4979-a38c-b69318f1a615" containerID="e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988" exitCode=0 Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.745243 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" event={"ID":"00ebde91-a24c-4979-a38c-b69318f1a615","Type":"ContainerDied","Data":"e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988"} Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.745493 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" event={"ID":"00ebde91-a24c-4979-a38c-b69318f1a615","Type":"ContainerStarted","Data":"4e905d8f2e0e09272c1667add69458a6bd100021f438503b719b88afae0af989"} Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.750400 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" event={"ID":"452f63cc-39ba-453e-89dc-c8537fd2ff30","Type":"ContainerStarted","Data":"cba21a7f86780aeb9cebb526e51452f845c0224e19e738f525da12f2ed20815b"} Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.750427 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" event={"ID":"452f63cc-39ba-453e-89dc-c8537fd2ff30","Type":"ContainerStarted","Data":"a7f646dea8189b0275b597c81ce45a4061dfe6e01012f1d14be9ac3e2a2c937d"} Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.752735 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm7gp" 
event={"ID":"be1b68ad-31cf-492b-a3c6-eae046daf5e0","Type":"ContainerStarted","Data":"c9867257d345bac4aba9e637fc89197fa8bbfdbd1ebeb7adaa9454fa842cd0f4"} Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.788061 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" podStartSLOduration=1.7880439510000001 podStartE2EDuration="1.788043951s" podCreationTimestamp="2025-12-16 08:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:32.779314757 +0000 UTC m=+5251.041202386" watchObservedRunningTime="2025-12-16 08:18:32.788043951 +0000 UTC m=+5251.049931580" Dec 16 08:18:32 crc kubenswrapper[4789]: I1216 08:18:32.802702 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xm7gp" podStartSLOduration=2.802685908 podStartE2EDuration="2.802685908s" podCreationTimestamp="2025-12-16 08:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:32.795610345 +0000 UTC m=+5251.057497974" watchObservedRunningTime="2025-12-16 08:18:32.802685908 +0000 UTC m=+5251.064573537" Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.821557 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44d189db-d38f-485b-92a4-eee68fab0902","Type":"ContainerStarted","Data":"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.822080 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44d189db-d38f-485b-92a4-eee68fab0902","Type":"ContainerStarted","Data":"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.824212 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb","Type":"ContainerStarted","Data":"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.824257 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb","Type":"ContainerStarted","Data":"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.826335 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"39df3a48-1eb0-4ffd-b69e-c047588efd4c","Type":"ContainerStarted","Data":"569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.828015 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7864394a-9bf8-40d1-b8a8-8b5b989516bb","Type":"ContainerStarted","Data":"9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.832089 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" event={"ID":"00ebde91-a24c-4979-a38c-b69318f1a615","Type":"ContainerStarted","Data":"e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.832208 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.833714 4789 generic.go:334] "Generic (PLEG): container finished" podID="452f63cc-39ba-453e-89dc-c8537fd2ff30" containerID="cba21a7f86780aeb9cebb526e51452f845c0224e19e738f525da12f2ed20815b" exitCode=0 Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.833747 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-k2fd8" event={"ID":"452f63cc-39ba-453e-89dc-c8537fd2ff30","Type":"ContainerDied","Data":"cba21a7f86780aeb9cebb526e51452f845c0224e19e738f525da12f2ed20815b"} Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.841462 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.439136401 podStartE2EDuration="5.841447922s" podCreationTimestamp="2025-12-16 08:18:30 +0000 UTC" firstStartedPulling="2025-12-16 08:18:31.671175688 +0000 UTC m=+5249.933063317" lastFinishedPulling="2025-12-16 08:18:35.073487209 +0000 UTC m=+5253.335374838" observedRunningTime="2025-12-16 08:18:35.835883866 +0000 UTC m=+5254.097771495" watchObservedRunningTime="2025-12-16 08:18:35.841447922 +0000 UTC m=+5254.103335541" Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.863256 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.349905 podStartE2EDuration="5.863235175s" podCreationTimestamp="2025-12-16 08:18:30 +0000 UTC" firstStartedPulling="2025-12-16 08:18:31.558136224 +0000 UTC m=+5249.820023853" lastFinishedPulling="2025-12-16 08:18:35.071466399 +0000 UTC m=+5253.333354028" observedRunningTime="2025-12-16 08:18:35.855534616 +0000 UTC m=+5254.117422245" watchObservedRunningTime="2025-12-16 08:18:35.863235175 +0000 UTC m=+5254.125122804" Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.877351 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.477212332 podStartE2EDuration="5.877332339s" podCreationTimestamp="2025-12-16 08:18:30 +0000 UTC" firstStartedPulling="2025-12-16 08:18:31.674207072 +0000 UTC m=+5249.936094701" lastFinishedPulling="2025-12-16 08:18:35.074327079 +0000 UTC m=+5253.336214708" observedRunningTime="2025-12-16 08:18:35.872004889 +0000 UTC m=+5254.133892538" watchObservedRunningTime="2025-12-16 
08:18:35.877332339 +0000 UTC m=+5254.139219968" Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.903178 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" podStartSLOduration=5.90315432 podStartE2EDuration="5.90315432s" podCreationTimestamp="2025-12-16 08:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:35.891014964 +0000 UTC m=+5254.152902623" watchObservedRunningTime="2025-12-16 08:18:35.90315432 +0000 UTC m=+5254.165041949" Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.911027 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:35 crc kubenswrapper[4789]: I1216 08:18:35.916325 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.25460912 podStartE2EDuration="5.916306872s" podCreationTimestamp="2025-12-16 08:18:30 +0000 UTC" firstStartedPulling="2025-12-16 08:18:31.426575578 +0000 UTC m=+5249.688463207" lastFinishedPulling="2025-12-16 08:18:35.08827333 +0000 UTC m=+5253.350160959" observedRunningTime="2025-12-16 08:18:35.910620443 +0000 UTC m=+5254.172508072" watchObservedRunningTime="2025-12-16 08:18:35.916306872 +0000 UTC m=+5254.178194491" Dec 16 08:18:36 crc kubenswrapper[4789]: I1216 08:18:36.136653 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 08:18:36 crc kubenswrapper[4789]: I1216 08:18:36.136700 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:18:36 crc kubenswrapper[4789]: I1216 08:18:36.136710 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:18:36 crc kubenswrapper[4789]: I1216 08:18:36.844822 4789 generic.go:334] "Generic (PLEG): container 
finished" podID="be1b68ad-31cf-492b-a3c6-eae046daf5e0" containerID="c9867257d345bac4aba9e637fc89197fa8bbfdbd1ebeb7adaa9454fa842cd0f4" exitCode=0 Dec 16 08:18:36 crc kubenswrapper[4789]: I1216 08:18:36.845039 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm7gp" event={"ID":"be1b68ad-31cf-492b-a3c6-eae046daf5e0","Type":"ContainerDied","Data":"c9867257d345bac4aba9e637fc89197fa8bbfdbd1ebeb7adaa9454fa842cd0f4"} Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.244970 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.359014 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k9fp\" (UniqueName: \"kubernetes.io/projected/452f63cc-39ba-453e-89dc-c8537fd2ff30-kube-api-access-5k9fp\") pod \"452f63cc-39ba-453e-89dc-c8537fd2ff30\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.359143 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-scripts\") pod \"452f63cc-39ba-453e-89dc-c8537fd2ff30\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.359288 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-combined-ca-bundle\") pod \"452f63cc-39ba-453e-89dc-c8537fd2ff30\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.359316 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-config-data\") pod 
\"452f63cc-39ba-453e-89dc-c8537fd2ff30\" (UID: \"452f63cc-39ba-453e-89dc-c8537fd2ff30\") " Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.370863 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-scripts" (OuterVolumeSpecName: "scripts") pod "452f63cc-39ba-453e-89dc-c8537fd2ff30" (UID: "452f63cc-39ba-453e-89dc-c8537fd2ff30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.370979 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452f63cc-39ba-453e-89dc-c8537fd2ff30-kube-api-access-5k9fp" (OuterVolumeSpecName: "kube-api-access-5k9fp") pod "452f63cc-39ba-453e-89dc-c8537fd2ff30" (UID: "452f63cc-39ba-453e-89dc-c8537fd2ff30"). InnerVolumeSpecName "kube-api-access-5k9fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.387487 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "452f63cc-39ba-453e-89dc-c8537fd2ff30" (UID: "452f63cc-39ba-453e-89dc-c8537fd2ff30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.389256 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-config-data" (OuterVolumeSpecName: "config-data") pod "452f63cc-39ba-453e-89dc-c8537fd2ff30" (UID: "452f63cc-39ba-453e-89dc-c8537fd2ff30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.461764 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k9fp\" (UniqueName: \"kubernetes.io/projected/452f63cc-39ba-453e-89dc-c8537fd2ff30-kube-api-access-5k9fp\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.461800 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.461819 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.461830 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452f63cc-39ba-453e-89dc-c8537fd2ff30-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.858349 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" event={"ID":"452f63cc-39ba-453e-89dc-c8537fd2ff30","Type":"ContainerDied","Data":"a7f646dea8189b0275b597c81ce45a4061dfe6e01012f1d14be9ac3e2a2c937d"} Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.858952 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f646dea8189b0275b597c81ce45a4061dfe6e01012f1d14be9ac3e2a2c937d" Dec 16 08:18:37 crc kubenswrapper[4789]: I1216 08:18:37.858553 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k2fd8" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.020365 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:18:38 crc kubenswrapper[4789]: E1216 08:18:38.021144 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452f63cc-39ba-453e-89dc-c8537fd2ff30" containerName="nova-cell1-conductor-db-sync" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.021169 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="452f63cc-39ba-453e-89dc-c8537fd2ff30" containerName="nova-cell1-conductor-db-sync" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.021388 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="452f63cc-39ba-453e-89dc-c8537fd2ff30" containerName="nova-cell1-conductor-db-sync" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.023233 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.026595 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.029128 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.078416 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qgxn\" (UniqueName: \"kubernetes.io/projected/3a2c87a8-8e65-4763-9ae8-1507026f0904-kube-api-access-6qgxn\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.078496 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.078944 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.180714 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qgxn\" (UniqueName: \"kubernetes.io/projected/3a2c87a8-8e65-4763-9ae8-1507026f0904-kube-api-access-6qgxn\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.180832 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.180920 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.186460 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.186865 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.196874 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qgxn\" (UniqueName: \"kubernetes.io/projected/3a2c87a8-8e65-4763-9ae8-1507026f0904-kube-api-access-6qgxn\") pod \"nova-cell1-conductor-0\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.278529 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.348289 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.384708 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-scripts\") pod \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.384940 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-combined-ca-bundle\") pod \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.384977 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-config-data\") pod \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.385036 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-467jb\" (UniqueName: \"kubernetes.io/projected/be1b68ad-31cf-492b-a3c6-eae046daf5e0-kube-api-access-467jb\") pod \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\" (UID: \"be1b68ad-31cf-492b-a3c6-eae046daf5e0\") " Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.389460 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-scripts" (OuterVolumeSpecName: "scripts") pod "be1b68ad-31cf-492b-a3c6-eae046daf5e0" (UID: "be1b68ad-31cf-492b-a3c6-eae046daf5e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.389679 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1b68ad-31cf-492b-a3c6-eae046daf5e0-kube-api-access-467jb" (OuterVolumeSpecName: "kube-api-access-467jb") pod "be1b68ad-31cf-492b-a3c6-eae046daf5e0" (UID: "be1b68ad-31cf-492b-a3c6-eae046daf5e0"). InnerVolumeSpecName "kube-api-access-467jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.409967 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-config-data" (OuterVolumeSpecName: "config-data") pod "be1b68ad-31cf-492b-a3c6-eae046daf5e0" (UID: "be1b68ad-31cf-492b-a3c6-eae046daf5e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.411556 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be1b68ad-31cf-492b-a3c6-eae046daf5e0" (UID: "be1b68ad-31cf-492b-a3c6-eae046daf5e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.487219 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.487248 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.487258 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-467jb\" (UniqueName: \"kubernetes.io/projected/be1b68ad-31cf-492b-a3c6-eae046daf5e0-kube-api-access-467jb\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.487269 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be1b68ad-31cf-492b-a3c6-eae046daf5e0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.869388 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm7gp" event={"ID":"be1b68ad-31cf-492b-a3c6-eae046daf5e0","Type":"ContainerDied","Data":"159a9e7f43814a754524a3fb9e4f95c4183b1243de95a405819500c40be45b9b"} Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.869744 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159a9e7f43814a754524a3fb9e4f95c4183b1243de95a405819500c40be45b9b" Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.869471 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm7gp" Dec 16 08:18:38 crc kubenswrapper[4789]: W1216 08:18:38.944516 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a2c87a8_8e65_4763_9ae8_1507026f0904.slice/crio-a46b582c389621d19223119e37163b46b6526a08f9d616739533dec50ad549f2 WatchSource:0}: Error finding container a46b582c389621d19223119e37163b46b6526a08f9d616739533dec50ad549f2: Status 404 returned error can't find the container with id a46b582c389621d19223119e37163b46b6526a08f9d616739533dec50ad549f2 Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.959674 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.983209 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:38 crc kubenswrapper[4789]: I1216 08:18:38.983468 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="39df3a48-1eb0-4ffd-b69e-c047588efd4c" containerName="nova-scheduler-scheduler" containerID="cri-o://569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a" gracePeriod=30 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.001377 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.001669 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerName="nova-api-log" containerID="cri-o://ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24" gracePeriod=30 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.001825 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" 
containerName="nova-api-api" containerID="cri-o://841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736" gracePeriod=30 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.042254 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.043324 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-metadata" containerID="cri-o://6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba" gracePeriod=30 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.043317 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-log" containerID="cri-o://e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d" gracePeriod=30 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.658688 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.666780 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.718188 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-combined-ca-bundle\") pod \"44d189db-d38f-485b-92a4-eee68fab0902\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.718302 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d189db-d38f-485b-92a4-eee68fab0902-logs\") pod \"44d189db-d38f-485b-92a4-eee68fab0902\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.718380 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mvvh\" (UniqueName: \"kubernetes.io/projected/44d189db-d38f-485b-92a4-eee68fab0902-kube-api-access-9mvvh\") pod \"44d189db-d38f-485b-92a4-eee68fab0902\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.718480 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-config-data\") pod \"44d189db-d38f-485b-92a4-eee68fab0902\" (UID: \"44d189db-d38f-485b-92a4-eee68fab0902\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.719389 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d189db-d38f-485b-92a4-eee68fab0902-logs" (OuterVolumeSpecName: "logs") pod "44d189db-d38f-485b-92a4-eee68fab0902" (UID: "44d189db-d38f-485b-92a4-eee68fab0902"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.723950 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d189db-d38f-485b-92a4-eee68fab0902-kube-api-access-9mvvh" (OuterVolumeSpecName: "kube-api-access-9mvvh") pod "44d189db-d38f-485b-92a4-eee68fab0902" (UID: "44d189db-d38f-485b-92a4-eee68fab0902"). InnerVolumeSpecName "kube-api-access-9mvvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.746241 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-config-data" (OuterVolumeSpecName: "config-data") pod "44d189db-d38f-485b-92a4-eee68fab0902" (UID: "44d189db-d38f-485b-92a4-eee68fab0902"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.757071 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44d189db-d38f-485b-92a4-eee68fab0902" (UID: "44d189db-d38f-485b-92a4-eee68fab0902"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.819801 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9ldr\" (UniqueName: \"kubernetes.io/projected/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-kube-api-access-m9ldr\") pod \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.820397 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-combined-ca-bundle\") pod \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.820446 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-config-data\") pod \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.820479 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-logs\") pod \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\" (UID: \"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb\") " Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.820840 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-logs" (OuterVolumeSpecName: "logs") pod "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" (UID: "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.820975 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.820989 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44d189db-d38f-485b-92a4-eee68fab0902-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.821000 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mvvh\" (UniqueName: \"kubernetes.io/projected/44d189db-d38f-485b-92a4-eee68fab0902-kube-api-access-9mvvh\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.821011 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44d189db-d38f-485b-92a4-eee68fab0902-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.824741 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-kube-api-access-m9ldr" (OuterVolumeSpecName: "kube-api-access-m9ldr") pod "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" (UID: "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb"). InnerVolumeSpecName "kube-api-access-m9ldr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.844462 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" (UID: "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.850225 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-config-data" (OuterVolumeSpecName: "config-data") pod "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" (UID: "8a05a588-5b7a-4f3c-99e8-4e84e1110dfb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.885765 4789 generic.go:334] "Generic (PLEG): container finished" podID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerID="841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736" exitCode=0 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.885798 4789 generic.go:334] "Generic (PLEG): container finished" podID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerID="ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24" exitCode=143 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.885870 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.885879 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb","Type":"ContainerDied","Data":"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.887252 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb","Type":"ContainerDied","Data":"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.887280 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a05a588-5b7a-4f3c-99e8-4e84e1110dfb","Type":"ContainerDied","Data":"bb8c48b7e118bb0a115b80f1c6ca8f039663449ad7361567497be154a6911519"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.887305 4789 scope.go:117] "RemoveContainer" containerID="841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.892262 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3a2c87a8-8e65-4763-9ae8-1507026f0904","Type":"ContainerStarted","Data":"8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.892302 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3a2c87a8-8e65-4763-9ae8-1507026f0904","Type":"ContainerStarted","Data":"a46b582c389621d19223119e37163b46b6526a08f9d616739533dec50ad549f2"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.892408 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.895539 4789 generic.go:334] "Generic 
(PLEG): container finished" podID="44d189db-d38f-485b-92a4-eee68fab0902" containerID="6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba" exitCode=0 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.895574 4789 generic.go:334] "Generic (PLEG): container finished" podID="44d189db-d38f-485b-92a4-eee68fab0902" containerID="e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d" exitCode=143 Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.895584 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.895598 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44d189db-d38f-485b-92a4-eee68fab0902","Type":"ContainerDied","Data":"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.895627 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44d189db-d38f-485b-92a4-eee68fab0902","Type":"ContainerDied","Data":"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.895641 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"44d189db-d38f-485b-92a4-eee68fab0902","Type":"ContainerDied","Data":"336d6013c9df11539d36fc8ab6dbc0bd5af3ddd6fd70853c64374f169eba70ab"} Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.907107 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.9070908380000002 podStartE2EDuration="2.907090838s" podCreationTimestamp="2025-12-16 08:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:39.90471586 +0000 UTC m=+5258.166603489" 
watchObservedRunningTime="2025-12-16 08:18:39.907090838 +0000 UTC m=+5258.168978467" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.923185 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9ldr\" (UniqueName: \"kubernetes.io/projected/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-kube-api-access-m9ldr\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.923213 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.923223 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.923235 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.933168 4789 scope.go:117] "RemoveContainer" containerID="ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.957541 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.963058 4789 scope.go:117] "RemoveContainer" containerID="841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736" Dec 16 08:18:39 crc kubenswrapper[4789]: E1216 08:18:39.963517 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736\": container with ID starting with 
841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736 not found: ID does not exist" containerID="841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.963559 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736"} err="failed to get container status \"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736\": rpc error: code = NotFound desc = could not find container \"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736\": container with ID starting with 841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736 not found: ID does not exist" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.963588 4789 scope.go:117] "RemoveContainer" containerID="ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24" Dec 16 08:18:39 crc kubenswrapper[4789]: E1216 08:18:39.964405 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24\": container with ID starting with ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24 not found: ID does not exist" containerID="ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.964449 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24"} err="failed to get container status \"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24\": rpc error: code = NotFound desc = could not find container \"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24\": container with ID starting with ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24 not found: ID does not 
exist" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.964478 4789 scope.go:117] "RemoveContainer" containerID="841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.966151 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736"} err="failed to get container status \"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736\": rpc error: code = NotFound desc = could not find container \"841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736\": container with ID starting with 841146ef70cd5e18b5624b6f23a0b4b371fd728f0459a0f68a5fd9826d6fb736 not found: ID does not exist" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.966258 4789 scope.go:117] "RemoveContainer" containerID="ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.966646 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24"} err="failed to get container status \"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24\": rpc error: code = NotFound desc = could not find container \"ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24\": container with ID starting with ee1da7e4f6a427dfe3a4556960a8aebd69a50789ad1290f75bc0745ed9402b24 not found: ID does not exist" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.966681 4789 scope.go:117] "RemoveContainer" containerID="6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.976657 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992023 4789 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Dec 16 08:18:39 crc kubenswrapper[4789]: E1216 08:18:39.992427 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerName="nova-api-log" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992444 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerName="nova-api-log" Dec 16 08:18:39 crc kubenswrapper[4789]: E1216 08:18:39.992455 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-metadata" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992461 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-metadata" Dec 16 08:18:39 crc kubenswrapper[4789]: E1216 08:18:39.992474 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-log" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992481 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-log" Dec 16 08:18:39 crc kubenswrapper[4789]: E1216 08:18:39.992500 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerName="nova-api-api" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992506 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerName="nova-api-api" Dec 16 08:18:39 crc kubenswrapper[4789]: E1216 08:18:39.992521 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1b68ad-31cf-492b-a3c6-eae046daf5e0" containerName="nova-manage" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992527 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1b68ad-31cf-492b-a3c6-eae046daf5e0" 
containerName="nova-manage" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992696 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1b68ad-31cf-492b-a3c6-eae046daf5e0" containerName="nova-manage" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992710 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerName="nova-api-api" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992719 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-metadata" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992731 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" containerName="nova-api-log" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.992742 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d189db-d38f-485b-92a4-eee68fab0902" containerName="nova-metadata-log" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.994021 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:18:39 crc kubenswrapper[4789]: I1216 08:18:39.999101 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.000079 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.002229 4789 scope.go:117] "RemoveContainer" containerID="e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.010159 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.022071 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.023685 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.029657 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.033958 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.051641 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.053172 4789 scope.go:117] "RemoveContainer" containerID="6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba" Dec 16 08:18:40 crc kubenswrapper[4789]: E1216 08:18:40.053548 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba\": container with ID starting with 
6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba not found: ID does not exist" containerID="6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.053583 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba"} err="failed to get container status \"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba\": rpc error: code = NotFound desc = could not find container \"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba\": container with ID starting with 6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba not found: ID does not exist" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.053608 4789 scope.go:117] "RemoveContainer" containerID="e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d" Dec 16 08:18:40 crc kubenswrapper[4789]: E1216 08:18:40.054033 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d\": container with ID starting with e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d not found: ID does not exist" containerID="e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.054101 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d"} err="failed to get container status \"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d\": rpc error: code = NotFound desc = could not find container \"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d\": container with ID starting with e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d not found: ID does not 
exist" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.054127 4789 scope.go:117] "RemoveContainer" containerID="6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.054473 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba"} err="failed to get container status \"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba\": rpc error: code = NotFound desc = could not find container \"6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba\": container with ID starting with 6b462f607e37fd8aa5d8583f75b744eef239fea3202015249b60c594a02247ba not found: ID does not exist" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.054499 4789 scope.go:117] "RemoveContainer" containerID="e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.054684 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d"} err="failed to get container status \"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d\": rpc error: code = NotFound desc = could not find container \"e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d\": container with ID starting with e97411574cf64e48e72282ead87b4f5f481bccc234df0df016679db50b936e9d not found: ID does not exist" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.115328 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d189db-d38f-485b-92a4-eee68fab0902" path="/var/lib/kubelet/pods/44d189db-d38f-485b-92a4-eee68fab0902/volumes" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.116234 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a05a588-5b7a-4f3c-99e8-4e84e1110dfb" 
path="/var/lib/kubelet/pods/8a05a588-5b7a-4f3c-99e8-4e84e1110dfb/volumes" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.126198 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-config-data\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.126294 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.126328 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdttt\" (UniqueName: \"kubernetes.io/projected/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-kube-api-access-hdttt\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.126371 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-logs\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.126414 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-config-data\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 
08:18:40.126461 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-logs\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.126485 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.126528 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzdhw\" (UniqueName: \"kubernetes.io/projected/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-kube-api-access-rzdhw\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.228370 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-config-data\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.228903 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-logs\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.229019 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.229110 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzdhw\" (UniqueName: \"kubernetes.io/projected/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-kube-api-access-rzdhw\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.229297 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-config-data\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.229578 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-logs\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.229792 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.230064 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdttt\" (UniqueName: \"kubernetes.io/projected/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-kube-api-access-hdttt\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc 
kubenswrapper[4789]: I1216 08:18:40.230426 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-logs\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.234603 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.235665 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-logs\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.243770 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-config-data\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.244591 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-config-data\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.245106 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.258155 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzdhw\" (UniqueName: \"kubernetes.io/projected/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-kube-api-access-rzdhw\") pod \"nova-metadata-0\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.259516 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdttt\" (UniqueName: \"kubernetes.io/projected/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-kube-api-access-hdttt\") pod \"nova-api-0\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.322193 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.342843 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.765743 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:40 crc kubenswrapper[4789]: W1216 08:18:40.818841 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2845e7b5_15e3_4e42_951f_cf4383ba6c5e.slice/crio-31a4f795dc70cf54408686533150fa1695e2073e505e264a0aefc19c5c06b889 WatchSource:0}: Error finding container 31a4f795dc70cf54408686533150fa1695e2073e505e264a0aefc19c5c06b889: Status 404 returned error can't find the container with id 31a4f795dc70cf54408686533150fa1695e2073e505e264a0aefc19c5c06b889 Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.819334 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.851289 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-config-data\") pod \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.851343 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-combined-ca-bundle\") pod \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.851380 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmxpx\" (UniqueName: \"kubernetes.io/projected/39df3a48-1eb0-4ffd-b69e-c047588efd4c-kube-api-access-zmxpx\") pod \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\" (UID: \"39df3a48-1eb0-4ffd-b69e-c047588efd4c\") " Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.857322 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/39df3a48-1eb0-4ffd-b69e-c047588efd4c-kube-api-access-zmxpx" (OuterVolumeSpecName: "kube-api-access-zmxpx") pod "39df3a48-1eb0-4ffd-b69e-c047588efd4c" (UID: "39df3a48-1eb0-4ffd-b69e-c047588efd4c"). InnerVolumeSpecName "kube-api-access-zmxpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.874395 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39df3a48-1eb0-4ffd-b69e-c047588efd4c" (UID: "39df3a48-1eb0-4ffd-b69e-c047588efd4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.874944 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-config-data" (OuterVolumeSpecName: "config-data") pod "39df3a48-1eb0-4ffd-b69e-c047588efd4c" (UID: "39df3a48-1eb0-4ffd-b69e-c047588efd4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.907807 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.912947 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.919131 4789 generic.go:334] "Generic (PLEG): container finished" podID="39df3a48-1eb0-4ffd-b69e-c047588efd4c" containerID="569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a" exitCode=0 Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.919207 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.919224 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"39df3a48-1eb0-4ffd-b69e-c047588efd4c","Type":"ContainerDied","Data":"569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a"} Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.919261 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"39df3a48-1eb0-4ffd-b69e-c047588efd4c","Type":"ContainerDied","Data":"f4335f25eb6b275f126908394d8bee0e6e011b5a20a32fca03081b21e0c1a4d5"} Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.919281 4789 scope.go:117] "RemoveContainer" containerID="569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a" Dec 16 08:18:40 crc kubenswrapper[4789]: W1216 08:18:40.919645 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79fc14aa_9f78_4ccf_8e35_3a174a8fbd98.slice/crio-f42b26020bfafc8a3db5c86ea2263210ea880564716c22f30ae316668f6c1790 WatchSource:0}: Error finding container f42b26020bfafc8a3db5c86ea2263210ea880564716c22f30ae316668f6c1790: Status 404 returned error can't find the container with id f42b26020bfafc8a3db5c86ea2263210ea880564716c22f30ae316668f6c1790 Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.920223 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2845e7b5-15e3-4e42-951f-cf4383ba6c5e","Type":"ContainerStarted","Data":"31a4f795dc70cf54408686533150fa1695e2073e505e264a0aefc19c5c06b889"} Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.922732 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.945437 4789 scope.go:117] "RemoveContainer" 
containerID="569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a" Dec 16 08:18:40 crc kubenswrapper[4789]: E1216 08:18:40.946140 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a\": container with ID starting with 569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a not found: ID does not exist" containerID="569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.946177 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a"} err="failed to get container status \"569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a\": rpc error: code = NotFound desc = could not find container \"569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a\": container with ID starting with 569a4764b492abf855a3e5eae492a8331f927ff773aa0943c74e6368b00fc72a not found: ID does not exist" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.961993 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.962024 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39df3a48-1eb0-4ffd-b69e-c047588efd4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.962035 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmxpx\" (UniqueName: \"kubernetes.io/projected/39df3a48-1eb0-4ffd-b69e-c047588efd4c-kube-api-access-zmxpx\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 
08:18:40.978094 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.988393 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.998072 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:40 crc kubenswrapper[4789]: E1216 08:18:40.998458 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39df3a48-1eb0-4ffd-b69e-c047588efd4c" containerName="nova-scheduler-scheduler" Dec 16 08:18:40 crc kubenswrapper[4789]: I1216 08:18:40.998471 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="39df3a48-1eb0-4ffd-b69e-c047588efd4c" containerName="nova-scheduler-scheduler" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.003986 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="39df3a48-1eb0-4ffd-b69e-c047588efd4c" containerName="nova-scheduler-scheduler" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.005201 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.007773 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.026235 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.063427 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdmwq\" (UniqueName: \"kubernetes.io/projected/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-kube-api-access-gdmwq\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.063536 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.063566 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-config-data\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.143157 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.165154 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdmwq\" (UniqueName: \"kubernetes.io/projected/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-kube-api-access-gdmwq\") pod 
\"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.165293 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.165328 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-config-data\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.179687 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-config-data\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.179738 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.185735 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdmwq\" (UniqueName: \"kubernetes.io/projected/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-kube-api-access-gdmwq\") pod \"nova-scheduler-0\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.243071 4789 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c59c47c-dzbn6"] Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.243350 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerName="dnsmasq-dns" containerID="cri-o://3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e" gracePeriod=10 Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.331557 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.792487 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.882999 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-config\") pod \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.883402 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-nb\") pod \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.883442 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-sb\") pod \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.883494 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-dns-svc\") pod \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.883547 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7t2m\" (UniqueName: \"kubernetes.io/projected/23f4c5ef-0e8d-4939-b577-14b76d2ece57-kube-api-access-p7t2m\") pod \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\" (UID: \"23f4c5ef-0e8d-4939-b577-14b76d2ece57\") " Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.888986 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f4c5ef-0e8d-4939-b577-14b76d2ece57-kube-api-access-p7t2m" (OuterVolumeSpecName: "kube-api-access-p7t2m") pod "23f4c5ef-0e8d-4939-b577-14b76d2ece57" (UID: "23f4c5ef-0e8d-4939-b577-14b76d2ece57"). InnerVolumeSpecName "kube-api-access-p7t2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.930226 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2845e7b5-15e3-4e42-951f-cf4383ba6c5e","Type":"ContainerStarted","Data":"5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11"} Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.930296 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2845e7b5-15e3-4e42-951f-cf4383ba6c5e","Type":"ContainerStarted","Data":"268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120"} Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.931710 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98","Type":"ContainerStarted","Data":"8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493"} Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.931742 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98","Type":"ContainerStarted","Data":"abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab"} Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.931756 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98","Type":"ContainerStarted","Data":"f42b26020bfafc8a3db5c86ea2263210ea880564716c22f30ae316668f6c1790"} Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.934944 4789 generic.go:334] "Generic (PLEG): container finished" podID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerID="3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e" exitCode=0 Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.935004 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.935002 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" event={"ID":"23f4c5ef-0e8d-4939-b577-14b76d2ece57","Type":"ContainerDied","Data":"3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e"} Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.935132 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" event={"ID":"23f4c5ef-0e8d-4939-b577-14b76d2ece57","Type":"ContainerDied","Data":"c05154acfe04cb21850f7ec6dc20c78d93dd21e053817507212400822c7805c1"} Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.935152 4789 scope.go:117] "RemoveContainer" containerID="3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.935804 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.941686 4789 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23f4c5ef-0e8d-4939-b577-14b76d2ece57" (UID: "23f4c5ef-0e8d-4939-b577-14b76d2ece57"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.941806 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "23f4c5ef-0e8d-4939-b577-14b76d2ece57" (UID: "23f4c5ef-0e8d-4939-b577-14b76d2ece57"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.945393 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.949552 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-config" (OuterVolumeSpecName: "config") pod "23f4c5ef-0e8d-4939-b577-14b76d2ece57" (UID: "23f4c5ef-0e8d-4939-b577-14b76d2ece57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.954855 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "23f4c5ef-0e8d-4939-b577-14b76d2ece57" (UID: "23f4c5ef-0e8d-4939-b577-14b76d2ece57"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.961937 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.961903749 podStartE2EDuration="2.961903749s" podCreationTimestamp="2025-12-16 08:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:41.949600348 +0000 UTC m=+5260.211488007" watchObservedRunningTime="2025-12-16 08:18:41.961903749 +0000 UTC m=+5260.223791378" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.975621 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.975603233 podStartE2EDuration="2.975603233s" podCreationTimestamp="2025-12-16 08:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:41.972701622 +0000 UTC m=+5260.234589251" watchObservedRunningTime="2025-12-16 08:18:41.975603233 +0000 UTC m=+5260.237490862" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.981846 4789 scope.go:117] "RemoveContainer" containerID="1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.985549 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.985579 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.985592 4789 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.985604 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f4c5ef-0e8d-4939-b577-14b76d2ece57-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:41 crc kubenswrapper[4789]: I1216 08:18:41.985615 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7t2m\" (UniqueName: \"kubernetes.io/projected/23f4c5ef-0e8d-4939-b577-14b76d2ece57-kube-api-access-p7t2m\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.025849 4789 scope.go:117] "RemoveContainer" containerID="3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e" Dec 16 08:18:42 crc kubenswrapper[4789]: E1216 08:18:42.031284 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e\": container with ID starting with 3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e not found: ID does not exist" containerID="3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e" Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.031344 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e"} err="failed to get container status \"3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e\": rpc error: code = NotFound desc = could not find container \"3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e\": container with ID starting with 3dee94cbe388d0775dd32e8827666449460c2ea3ec0c6f90b7b63d5eb201b22e not found: ID does not exist" Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.031374 4789 
scope.go:117] "RemoveContainer" containerID="1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34" Dec 16 08:18:42 crc kubenswrapper[4789]: E1216 08:18:42.031823 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34\": container with ID starting with 1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34 not found: ID does not exist" containerID="1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34" Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.031861 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34"} err="failed to get container status \"1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34\": rpc error: code = NotFound desc = could not find container \"1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34\": container with ID starting with 1c6fb9746559d4e8e9b78ea320b2c23f9de4d48d4a76f0621a8c5abb1e797c34 not found: ID does not exist" Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.122032 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39df3a48-1eb0-4ffd-b69e-c047588efd4c" path="/var/lib/kubelet/pods/39df3a48-1eb0-4ffd-b69e-c047588efd4c/volumes" Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.265247 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c59c47c-dzbn6"] Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.273884 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9c59c47c-dzbn6"] Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.948630 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e","Type":"ContainerStarted","Data":"2af896891eae878b375a56099d5f19334ec2f929f7e761ba4037c8565c913d84"} Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.948728 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e","Type":"ContainerStarted","Data":"11cd08d4a69fda0de93f0111bf4549e822edb6b43fb3d3c9fd9965d904773fe0"} Dec 16 08:18:42 crc kubenswrapper[4789]: I1216 08:18:42.971960 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.971936009 podStartE2EDuration="2.971936009s" podCreationTimestamp="2025-12-16 08:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:42.96336221 +0000 UTC m=+5261.225249859" watchObservedRunningTime="2025-12-16 08:18:42.971936009 +0000 UTC m=+5261.233823638" Dec 16 08:18:44 crc kubenswrapper[4789]: I1216 08:18:44.114675 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" path="/var/lib/kubelet/pods/23f4c5ef-0e8d-4939-b577-14b76d2ece57/volumes" Dec 16 08:18:45 crc kubenswrapper[4789]: I1216 08:18:45.344025 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:18:45 crc kubenswrapper[4789]: I1216 08:18:45.344349 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:18:46 crc kubenswrapper[4789]: I1216 08:18:46.332618 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 08:18:46 crc kubenswrapper[4789]: I1216 08:18:46.772158 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9c59c47c-dzbn6" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.1.48:5353: i/o timeout" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.377748 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.826383 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vf547"] Dec 16 08:18:48 crc kubenswrapper[4789]: E1216 08:18:48.826818 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerName="init" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.826835 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerName="init" Dec 16 08:18:48 crc kubenswrapper[4789]: E1216 08:18:48.826881 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerName="dnsmasq-dns" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.826890 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerName="dnsmasq-dns" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.827077 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f4c5ef-0e8d-4939-b577-14b76d2ece57" containerName="dnsmasq-dns" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.827645 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.838146 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.838398 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.845288 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vf547"] Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.912297 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-config-data\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.912371 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-scripts\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.912496 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xvd\" (UniqueName: \"kubernetes.io/projected/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-kube-api-access-k4xvd\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:48 crc kubenswrapper[4789]: I1216 08:18:48.912541 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.014470 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-config-data\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.014541 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-scripts\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.014659 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xvd\" (UniqueName: \"kubernetes.io/projected/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-kube-api-access-k4xvd\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.014699 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.020538 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.036440 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-config-data\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.042658 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xvd\" (UniqueName: \"kubernetes.io/projected/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-kube-api-access-k4xvd\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.044774 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-scripts\") pod \"nova-cell1-cell-mapping-vf547\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.149163 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:49 crc kubenswrapper[4789]: W1216 08:18:49.636581 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a39c3d1_ce07_4b49_a2d2_a5a29fe3731e.slice/crio-681d9aeb642c5b82367c3e10ab05e4330386003a41f2bd3473adab2e3874a3ee WatchSource:0}: Error finding container 681d9aeb642c5b82367c3e10ab05e4330386003a41f2bd3473adab2e3874a3ee: Status 404 returned error can't find the container with id 681d9aeb642c5b82367c3e10ab05e4330386003a41f2bd3473adab2e3874a3ee Dec 16 08:18:49 crc kubenswrapper[4789]: I1216 08:18:49.637083 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vf547"] Dec 16 08:18:50 crc kubenswrapper[4789]: I1216 08:18:50.058142 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vf547" event={"ID":"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e","Type":"ContainerStarted","Data":"37e832098602f0dbb8a0559ccd18310279ae4c905118ed7222be7b524dc075a1"} Dec 16 08:18:50 crc kubenswrapper[4789]: I1216 08:18:50.058464 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vf547" event={"ID":"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e","Type":"ContainerStarted","Data":"681d9aeb642c5b82367c3e10ab05e4330386003a41f2bd3473adab2e3874a3ee"} Dec 16 08:18:50 crc kubenswrapper[4789]: I1216 08:18:50.083100 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vf547" podStartSLOduration=2.083078824 podStartE2EDuration="2.083078824s" podCreationTimestamp="2025-12-16 08:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:18:50.07471889 +0000 UTC m=+5268.336606529" watchObservedRunningTime="2025-12-16 08:18:50.083078824 +0000 UTC m=+5268.344966453" Dec 16 08:18:50 crc 
kubenswrapper[4789]: I1216 08:18:50.323757 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 08:18:50 crc kubenswrapper[4789]: I1216 08:18:50.323824 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 08:18:50 crc kubenswrapper[4789]: I1216 08:18:50.343970 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 08:18:50 crc kubenswrapper[4789]: I1216 08:18:50.344065 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.332522 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.364030 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.489172 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.489212 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.489248 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.489265 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.68:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.928089 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:18:51 crc kubenswrapper[4789]: I1216 08:18:51.928153 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:18:52 crc kubenswrapper[4789]: I1216 08:18:52.103485 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 08:18:54 crc kubenswrapper[4789]: E1216 08:18:54.456530 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a39c3d1_ce07_4b49_a2d2_a5a29fe3731e.slice/crio-37e832098602f0dbb8a0559ccd18310279ae4c905118ed7222be7b524dc075a1.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a39c3d1_ce07_4b49_a2d2_a5a29fe3731e.slice/crio-conmon-37e832098602f0dbb8a0559ccd18310279ae4c905118ed7222be7b524dc075a1.scope\": RecentStats: unable to find data in memory cache]" Dec 16 08:18:55 crc kubenswrapper[4789]: I1216 08:18:55.101881 4789 generic.go:334] "Generic (PLEG): container finished" podID="8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" containerID="37e832098602f0dbb8a0559ccd18310279ae4c905118ed7222be7b524dc075a1" exitCode=0 Dec 16 08:18:55 crc kubenswrapper[4789]: I1216 08:18:55.102260 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vf547" event={"ID":"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e","Type":"ContainerDied","Data":"37e832098602f0dbb8a0559ccd18310279ae4c905118ed7222be7b524dc075a1"} Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.465861 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.554314 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-config-data\") pod \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.554366 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-scripts\") pod \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.555986 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-combined-ca-bundle\") pod \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\" 
(UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.556090 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4xvd\" (UniqueName: \"kubernetes.io/projected/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-kube-api-access-k4xvd\") pod \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\" (UID: \"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e\") " Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.561560 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-kube-api-access-k4xvd" (OuterVolumeSpecName: "kube-api-access-k4xvd") pod "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" (UID: "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e"). InnerVolumeSpecName "kube-api-access-k4xvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.563750 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-scripts" (OuterVolumeSpecName: "scripts") pod "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" (UID: "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.576890 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-config-data" (OuterVolumeSpecName: "config-data") pod "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" (UID: "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.579167 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" (UID: "8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.658288 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4xvd\" (UniqueName: \"kubernetes.io/projected/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-kube-api-access-k4xvd\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.658345 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.658369 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:56 crc kubenswrapper[4789]: I1216 08:18:56.658384 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.119716 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vf547" event={"ID":"8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e","Type":"ContainerDied","Data":"681d9aeb642c5b82367c3e10ab05e4330386003a41f2bd3473adab2e3874a3ee"} Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.119755 4789 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="681d9aeb642c5b82367c3e10ab05e4330386003a41f2bd3473adab2e3874a3ee" Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.119776 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vf547" Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.301993 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.302276 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" containerName="nova-scheduler-scheduler" containerID="cri-o://2af896891eae878b375a56099d5f19334ec2f929f7e761ba4037c8565c913d84" gracePeriod=30 Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.320595 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.320865 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-log" containerID="cri-o://268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120" gracePeriod=30 Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.320963 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-api" containerID="cri-o://5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11" gracePeriod=30 Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.337568 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.337795 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" 
containerName="nova-metadata-log" containerID="cri-o://abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab" gracePeriod=30 Dec 16 08:18:57 crc kubenswrapper[4789]: I1216 08:18:57.338269 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-metadata" containerID="cri-o://8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493" gracePeriod=30 Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.132355 4789 generic.go:334] "Generic (PLEG): container finished" podID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerID="abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab" exitCode=143 Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.132728 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98","Type":"ContainerDied","Data":"abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab"} Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.135476 4789 generic.go:334] "Generic (PLEG): container finished" podID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerID="268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120" exitCode=143 Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.135525 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2845e7b5-15e3-4e42-951f-cf4383ba6c5e","Type":"ContainerDied","Data":"268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120"} Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.147905 4789 generic.go:334] "Generic (PLEG): container finished" podID="672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" containerID="2af896891eae878b375a56099d5f19334ec2f929f7e761ba4037c8565c913d84" exitCode=0 Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.147981 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e","Type":"ContainerDied","Data":"2af896891eae878b375a56099d5f19334ec2f929f7e761ba4037c8565c913d84"} Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.488681 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.591830 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-config-data\") pod \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.592100 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-combined-ca-bundle\") pod \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.592147 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdmwq\" (UniqueName: \"kubernetes.io/projected/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-kube-api-access-gdmwq\") pod \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\" (UID: \"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e\") " Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.611235 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-kube-api-access-gdmwq" (OuterVolumeSpecName: "kube-api-access-gdmwq") pod "672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" (UID: "672b6c39-6ef1-4ca6-aab3-43e5bb6d945e"). InnerVolumeSpecName "kube-api-access-gdmwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.617299 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" (UID: "672b6c39-6ef1-4ca6-aab3-43e5bb6d945e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.646204 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-config-data" (OuterVolumeSpecName: "config-data") pod "672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" (UID: "672b6c39-6ef1-4ca6-aab3-43e5bb6d945e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.694695 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.694733 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdmwq\" (UniqueName: \"kubernetes.io/projected/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-kube-api-access-gdmwq\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:58 crc kubenswrapper[4789]: I1216 08:18:58.694744 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.160895 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"672b6c39-6ef1-4ca6-aab3-43e5bb6d945e","Type":"ContainerDied","Data":"11cd08d4a69fda0de93f0111bf4549e822edb6b43fb3d3c9fd9965d904773fe0"} Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.160975 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.160992 4789 scope.go:117] "RemoveContainer" containerID="2af896891eae878b375a56099d5f19334ec2f929f7e761ba4037c8565c913d84" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.197211 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.203049 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.220627 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:59 crc kubenswrapper[4789]: E1216 08:18:59.221090 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" containerName="nova-manage" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.221112 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" containerName="nova-manage" Dec 16 08:18:59 crc kubenswrapper[4789]: E1216 08:18:59.221134 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" containerName="nova-scheduler-scheduler" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.221144 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" containerName="nova-scheduler-scheduler" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.221376 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" containerName="nova-scheduler-scheduler" Dec 16 08:18:59 crc 
kubenswrapper[4789]: I1216 08:18:59.221398 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" containerName="nova-manage" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.222130 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.224572 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.234606 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.304278 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6c9c\" (UniqueName: \"kubernetes.io/projected/7096ea39-3a7b-4eca-b47b-953462331ae8-kube-api-access-t6c9c\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.304802 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.305125 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-config-data\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.407384 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.407477 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-config-data\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.407572 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6c9c\" (UniqueName: \"kubernetes.io/projected/7096ea39-3a7b-4eca-b47b-953462331ae8-kube-api-access-t6c9c\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.412037 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-config-data\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.417225 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.429280 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6c9c\" (UniqueName: \"kubernetes.io/projected/7096ea39-3a7b-4eca-b47b-953462331ae8-kube-api-access-t6c9c\") pod \"nova-scheduler-0\" (UID: 
\"7096ea39-3a7b-4eca-b47b-953462331ae8\") " pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.551975 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:18:59 crc kubenswrapper[4789]: I1216 08:18:59.979519 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:18:59 crc kubenswrapper[4789]: W1216 08:18:59.983635 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7096ea39_3a7b_4eca_b47b_953462331ae8.slice/crio-1635cfa2a17948cca46761ea36545dcfe8226ee92aa1272b7df7ceb8872dd220 WatchSource:0}: Error finding container 1635cfa2a17948cca46761ea36545dcfe8226ee92aa1272b7df7ceb8872dd220: Status 404 returned error can't find the container with id 1635cfa2a17948cca46761ea36545dcfe8226ee92aa1272b7df7ceb8872dd220 Dec 16 08:19:00 crc kubenswrapper[4789]: I1216 08:19:00.114481 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="672b6c39-6ef1-4ca6-aab3-43e5bb6d945e" path="/var/lib/kubelet/pods/672b6c39-6ef1-4ca6-aab3-43e5bb6d945e/volumes" Dec 16 08:19:00 crc kubenswrapper[4789]: I1216 08:19:00.171126 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7096ea39-3a7b-4eca-b47b-953462331ae8","Type":"ContainerStarted","Data":"1635cfa2a17948cca46761ea36545dcfe8226ee92aa1272b7df7ceb8872dd220"} Dec 16 08:19:00 crc kubenswrapper[4789]: I1216 08:19:00.940367 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:19:00 crc kubenswrapper[4789]: I1216 08:19:00.945606 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045150 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-config-data\") pod \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045285 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-logs\") pod \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045380 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-logs\") pod \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045401 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-config-data\") pod \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045455 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-combined-ca-bundle\") pod \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045511 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzdhw\" (UniqueName: 
\"kubernetes.io/projected/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-kube-api-access-rzdhw\") pod \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\" (UID: \"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045563 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-combined-ca-bundle\") pod \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045605 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdttt\" (UniqueName: \"kubernetes.io/projected/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-kube-api-access-hdttt\") pod \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\" (UID: \"2845e7b5-15e3-4e42-951f-cf4383ba6c5e\") " Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.045932 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-logs" (OuterVolumeSpecName: "logs") pod "2845e7b5-15e3-4e42-951f-cf4383ba6c5e" (UID: "2845e7b5-15e3-4e42-951f-cf4383ba6c5e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.046126 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.046153 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-logs" (OuterVolumeSpecName: "logs") pod "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" (UID: "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.050146 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-kube-api-access-hdttt" (OuterVolumeSpecName: "kube-api-access-hdttt") pod "2845e7b5-15e3-4e42-951f-cf4383ba6c5e" (UID: "2845e7b5-15e3-4e42-951f-cf4383ba6c5e"). InnerVolumeSpecName "kube-api-access-hdttt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.052323 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-kube-api-access-rzdhw" (OuterVolumeSpecName: "kube-api-access-rzdhw") pod "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" (UID: "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98"). InnerVolumeSpecName "kube-api-access-rzdhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.067438 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" (UID: "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.069315 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-config-data" (OuterVolumeSpecName: "config-data") pod "2845e7b5-15e3-4e42-951f-cf4383ba6c5e" (UID: "2845e7b5-15e3-4e42-951f-cf4383ba6c5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.069487 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-config-data" (OuterVolumeSpecName: "config-data") pod "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" (UID: "79fc14aa-9f78-4ccf-8e35-3a174a8fbd98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.070749 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2845e7b5-15e3-4e42-951f-cf4383ba6c5e" (UID: "2845e7b5-15e3-4e42-951f-cf4383ba6c5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.148356 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.148417 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.148432 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzdhw\" (UniqueName: \"kubernetes.io/projected/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-kube-api-access-rzdhw\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.148445 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.148481 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdttt\" (UniqueName: \"kubernetes.io/projected/2845e7b5-15e3-4e42-951f-cf4383ba6c5e-kube-api-access-hdttt\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.148495 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.148505 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.183756 4789 generic.go:334] "Generic (PLEG): container finished" podID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerID="8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493" exitCode=0 Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.183812 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.184320 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98","Type":"ContainerDied","Data":"8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493"} Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.184770 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79fc14aa-9f78-4ccf-8e35-3a174a8fbd98","Type":"ContainerDied","Data":"f42b26020bfafc8a3db5c86ea2263210ea880564716c22f30ae316668f6c1790"} Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.184882 4789 scope.go:117] "RemoveContainer" containerID="8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.186437 4789 generic.go:334] "Generic (PLEG): container finished" podID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerID="5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11" exitCode=0 Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.186554 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2845e7b5-15e3-4e42-951f-cf4383ba6c5e","Type":"ContainerDied","Data":"5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11"} Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.186638 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2845e7b5-15e3-4e42-951f-cf4383ba6c5e","Type":"ContainerDied","Data":"31a4f795dc70cf54408686533150fa1695e2073e505e264a0aefc19c5c06b889"} Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.186761 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.200845 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7096ea39-3a7b-4eca-b47b-953462331ae8","Type":"ContainerStarted","Data":"90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792"} Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.212992 4789 scope.go:117] "RemoveContainer" containerID="abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.225790 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.225775361 podStartE2EDuration="2.225775361s" podCreationTimestamp="2025-12-16 08:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:19:01.217617152 +0000 UTC m=+5279.479504771" watchObservedRunningTime="2025-12-16 08:19:01.225775361 +0000 UTC m=+5279.487662990" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.242147 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.245479 4789 scope.go:117] "RemoveContainer" containerID="8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493" Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 08:19:01.247145 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493\": container with ID starting with 8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493 not found: ID does not exist" containerID="8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.247228 4789 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493"} err="failed to get container status \"8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493\": rpc error: code = NotFound desc = could not find container \"8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493\": container with ID starting with 8285a1547f3d3bafbf7ea9fadc69b49eb377a5627eb70378b389fbe532918493 not found: ID does not exist" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.247299 4789 scope.go:117] "RemoveContainer" containerID="abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab" Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 08:19:01.247881 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab\": container with ID starting with abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab not found: ID does not exist" containerID="abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.247978 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab"} err="failed to get container status \"abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab\": rpc error: code = NotFound desc = could not find container \"abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab\": container with ID starting with abafacd59b1efada538db569a8745118fa462411ab05f43bb986c90ed92667ab not found: ID does not exist" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.248071 4789 scope.go:117] "RemoveContainer" containerID="5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.254961 4789 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.266493 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.280083 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.287879 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 08:19:01.288307 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-log" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288325 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-log" Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 08:19:01.288335 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-log" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288343 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-log" Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 08:19:01.288370 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-api" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288376 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-api" Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 08:19:01.288382 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-metadata" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288388 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-metadata" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288560 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-metadata" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288584 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-log" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288599 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" containerName="nova-api-api" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288608 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" containerName="nova-metadata-log" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.288653 4789 scope.go:117] "RemoveContainer" containerID="268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.289587 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.295252 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.297040 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.305261 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.308555 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.310026 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.319447 4789 scope.go:117] "RemoveContainer" containerID="5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.319569 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 08:19:01.320645 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11\": container with ID starting with 5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11 not found: ID does not exist" containerID="5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.320691 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11"} err="failed to get container status \"5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11\": rpc error: code = NotFound desc = could not find container \"5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11\": container with ID starting with 5542c3d45433a897938c2728b8338c17d7e5ee9db18da9e0408d802c9c370f11 not found: ID does not exist" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.320709 4789 scope.go:117] "RemoveContainer" containerID="268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120" Dec 16 08:19:01 crc kubenswrapper[4789]: E1216 
08:19:01.320972 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120\": container with ID starting with 268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120 not found: ID does not exist" containerID="268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.321016 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120"} err="failed to get container status \"268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120\": rpc error: code = NotFound desc = could not find container \"268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120\": container with ID starting with 268ec22fdb0ff64a257eed3e90252ed67d80393506ff5a383befe00fe8575120 not found: ID does not exist" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.358449 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgsx\" (UniqueName: \"kubernetes.io/projected/0a1a792a-ba73-4aa6-ad00-a15960cdecef-kube-api-access-hjgsx\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.358707 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-logs\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.358746 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.358863 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.358932 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-config-data\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.359264 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-config-data\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.359342 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a792a-ba73-4aa6-ad00-a15960cdecef-logs\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.359412 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6bwq\" (UniqueName: \"kubernetes.io/projected/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-kube-api-access-t6bwq\") pod \"nova-metadata-0\" (UID: 
\"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.460739 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-config-data\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.460798 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a792a-ba73-4aa6-ad00-a15960cdecef-logs\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.460837 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6bwq\" (UniqueName: \"kubernetes.io/projected/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-kube-api-access-t6bwq\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.460889 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgsx\" (UniqueName: \"kubernetes.io/projected/0a1a792a-ba73-4aa6-ad00-a15960cdecef-kube-api-access-hjgsx\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.460960 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-logs\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.460995 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.461031 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.461059 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-config-data\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.461832 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-logs\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.463634 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a792a-ba73-4aa6-ad00-a15960cdecef-logs\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.465432 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: 
I1216 08:19:01.466156 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-config-data\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.467331 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-config-data\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.469991 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.485180 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6bwq\" (UniqueName: \"kubernetes.io/projected/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-kube-api-access-t6bwq\") pod \"nova-metadata-0\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.485534 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgsx\" (UniqueName: \"kubernetes.io/projected/0a1a792a-ba73-4aa6-ad00-a15960cdecef-kube-api-access-hjgsx\") pod \"nova-api-0\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.614059 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.629540 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:19:01 crc kubenswrapper[4789]: I1216 08:19:01.938811 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:19:02 crc kubenswrapper[4789]: I1216 08:19:02.083517 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:19:02 crc kubenswrapper[4789]: W1216 08:19:02.092807 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a1a792a_ba73_4aa6_ad00_a15960cdecef.slice/crio-7b37cb619fec967c402869ba4ff7d5a4dc9e2c69a1633b57743b2e8a3680ea99 WatchSource:0}: Error finding container 7b37cb619fec967c402869ba4ff7d5a4dc9e2c69a1633b57743b2e8a3680ea99: Status 404 returned error can't find the container with id 7b37cb619fec967c402869ba4ff7d5a4dc9e2c69a1633b57743b2e8a3680ea99 Dec 16 08:19:02 crc kubenswrapper[4789]: I1216 08:19:02.153539 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2845e7b5-15e3-4e42-951f-cf4383ba6c5e" path="/var/lib/kubelet/pods/2845e7b5-15e3-4e42-951f-cf4383ba6c5e/volumes" Dec 16 08:19:02 crc kubenswrapper[4789]: I1216 08:19:02.157543 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fc14aa-9f78-4ccf-8e35-3a174a8fbd98" path="/var/lib/kubelet/pods/79fc14aa-9f78-4ccf-8e35-3a174a8fbd98/volumes" Dec 16 08:19:02 crc kubenswrapper[4789]: I1216 08:19:02.224314 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a","Type":"ContainerStarted","Data":"8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04"} Dec 16 08:19:02 crc kubenswrapper[4789]: I1216 08:19:02.224828 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a","Type":"ContainerStarted","Data":"8e37fdb9d882daec8f22322c8a895197b75f2923905335ac654b9e505beaedb1"} 
Dec 16 08:19:02 crc kubenswrapper[4789]: I1216 08:19:02.225751 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a1a792a-ba73-4aa6-ad00-a15960cdecef","Type":"ContainerStarted","Data":"7b37cb619fec967c402869ba4ff7d5a4dc9e2c69a1633b57743b2e8a3680ea99"} Dec 16 08:19:03 crc kubenswrapper[4789]: I1216 08:19:03.236444 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a1a792a-ba73-4aa6-ad00-a15960cdecef","Type":"ContainerStarted","Data":"5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98"} Dec 16 08:19:03 crc kubenswrapper[4789]: I1216 08:19:03.236869 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a1a792a-ba73-4aa6-ad00-a15960cdecef","Type":"ContainerStarted","Data":"86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494"} Dec 16 08:19:03 crc kubenswrapper[4789]: I1216 08:19:03.239658 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a","Type":"ContainerStarted","Data":"6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8"} Dec 16 08:19:03 crc kubenswrapper[4789]: I1216 08:19:03.261209 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.261192588 podStartE2EDuration="2.261192588s" podCreationTimestamp="2025-12-16 08:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:19:03.256179745 +0000 UTC m=+5281.518067374" watchObservedRunningTime="2025-12-16 08:19:03.261192588 +0000 UTC m=+5281.523080217" Dec 16 08:19:03 crc kubenswrapper[4789]: I1216 08:19:03.280056 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.280034028 podStartE2EDuration="2.280034028s" 
podCreationTimestamp="2025-12-16 08:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:19:03.27230161 +0000 UTC m=+5281.534189259" watchObservedRunningTime="2025-12-16 08:19:03.280034028 +0000 UTC m=+5281.541921657" Dec 16 08:19:04 crc kubenswrapper[4789]: I1216 08:19:04.552040 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 08:19:06 crc kubenswrapper[4789]: I1216 08:19:06.630165 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:19:06 crc kubenswrapper[4789]: I1216 08:19:06.630496 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:19:09 crc kubenswrapper[4789]: I1216 08:19:09.552668 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 08:19:09 crc kubenswrapper[4789]: I1216 08:19:09.575518 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 08:19:10 crc kubenswrapper[4789]: I1216 08:19:10.338201 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 08:19:11 crc kubenswrapper[4789]: I1216 08:19:11.615136 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 08:19:11 crc kubenswrapper[4789]: I1216 08:19:11.615236 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 08:19:11 crc kubenswrapper[4789]: I1216 08:19:11.630559 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 08:19:11 crc kubenswrapper[4789]: I1216 08:19:11.630626 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" 
Dec 16 08:19:12 crc kubenswrapper[4789]: I1216 08:19:12.780177 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:19:12 crc kubenswrapper[4789]: I1216 08:19:12.780219 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:19:12 crc kubenswrapper[4789]: I1216 08:19:12.780232 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:19:12 crc kubenswrapper[4789]: I1216 08:19:12.780241 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.618281 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.619643 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.622352 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 
08:19:21.624412 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.637499 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.640255 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.645599 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.928700 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.928841 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.929041 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.930470 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will 
be restarted" Dec 16 08:19:21 crc kubenswrapper[4789]: I1216 08:19:21.930564 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" gracePeriod=600 Dec 16 08:19:22 crc kubenswrapper[4789]: E1216 08:19:22.054300 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.411450 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" exitCode=0 Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.411477 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987"} Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.411513 4789 scope.go:117] "RemoveContainer" containerID="263dc2616d80f4ff03c2ff79d40529ccbb3a132a477a0c3fb859bf304de469fc" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.412171 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.412308 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Dec 16 08:19:22 crc kubenswrapper[4789]: E1216 08:19:22.412617 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.413865 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.416660 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.652417 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d8668765-x94jg"] Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.654464 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.660383 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-config\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.660524 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.660567 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7fg\" (UniqueName: \"kubernetes.io/projected/6e45f49a-ddb6-4f06-83b7-3492b38128ce-kube-api-access-5x7fg\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.660642 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.660692 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-dns-svc\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: 
\"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.663279 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d8668765-x94jg"] Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.762319 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-config\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.762405 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.762431 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7fg\" (UniqueName: \"kubernetes.io/projected/6e45f49a-ddb6-4f06-83b7-3492b38128ce-kube-api-access-5x7fg\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.762478 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.762513 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-dns-svc\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.763661 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.763708 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-dns-svc\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.764316 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-config\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.764730 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.802581 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x7fg\" (UniqueName: \"kubernetes.io/projected/6e45f49a-ddb6-4f06-83b7-3492b38128ce-kube-api-access-5x7fg\") pod \"dnsmasq-dns-7d8668765-x94jg\" (UID: 
\"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:22 crc kubenswrapper[4789]: I1216 08:19:22.970655 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:23 crc kubenswrapper[4789]: I1216 08:19:23.416225 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d8668765-x94jg"] Dec 16 08:19:23 crc kubenswrapper[4789]: W1216 08:19:23.424944 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e45f49a_ddb6_4f06_83b7_3492b38128ce.slice/crio-75b34cccb9e89cf9b6a450d6b3598d7a8bdcebd2d5bfbe92cbc4a0c066cc53cf WatchSource:0}: Error finding container 75b34cccb9e89cf9b6a450d6b3598d7a8bdcebd2d5bfbe92cbc4a0c066cc53cf: Status 404 returned error can't find the container with id 75b34cccb9e89cf9b6a450d6b3598d7a8bdcebd2d5bfbe92cbc4a0c066cc53cf Dec 16 08:19:24 crc kubenswrapper[4789]: I1216 08:19:24.433768 4789 generic.go:334] "Generic (PLEG): container finished" podID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" containerID="79c140cdd409de11a720345fc5bf860956f4328bde37ab51c6a4162a85832052" exitCode=0 Dec 16 08:19:24 crc kubenswrapper[4789]: I1216 08:19:24.433897 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8668765-x94jg" event={"ID":"6e45f49a-ddb6-4f06-83b7-3492b38128ce","Type":"ContainerDied","Data":"79c140cdd409de11a720345fc5bf860956f4328bde37ab51c6a4162a85832052"} Dec 16 08:19:24 crc kubenswrapper[4789]: I1216 08:19:24.434356 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8668765-x94jg" event={"ID":"6e45f49a-ddb6-4f06-83b7-3492b38128ce","Type":"ContainerStarted","Data":"75b34cccb9e89cf9b6a450d6b3598d7a8bdcebd2d5bfbe92cbc4a0c066cc53cf"} Dec 16 08:19:25 crc kubenswrapper[4789]: I1216 08:19:25.442929 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7d8668765-x94jg" event={"ID":"6e45f49a-ddb6-4f06-83b7-3492b38128ce","Type":"ContainerStarted","Data":"5a51d5d3e116521c9c6719588539232c77dabed6d40d1d88710cf84db8d2ebcf"} Dec 16 08:19:25 crc kubenswrapper[4789]: I1216 08:19:25.443461 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:25 crc kubenswrapper[4789]: I1216 08:19:25.466115 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d8668765-x94jg" podStartSLOduration=3.466095705 podStartE2EDuration="3.466095705s" podCreationTimestamp="2025-12-16 08:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:19:25.459090094 +0000 UTC m=+5303.720977733" watchObservedRunningTime="2025-12-16 08:19:25.466095705 +0000 UTC m=+5303.727983334" Dec 16 08:19:32 crc kubenswrapper[4789]: I1216 08:19:32.973073 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.045944 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7db745fdc9-fvwzc"] Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.046748 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" podUID="00ebde91-a24c-4979-a38c-b69318f1a615" containerName="dnsmasq-dns" containerID="cri-o://e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4" gracePeriod=10 Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.501946 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.514117 4789 generic.go:334] "Generic (PLEG): container finished" podID="00ebde91-a24c-4979-a38c-b69318f1a615" containerID="e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4" exitCode=0 Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.514168 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.514171 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" event={"ID":"00ebde91-a24c-4979-a38c-b69318f1a615","Type":"ContainerDied","Data":"e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4"} Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.514315 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db745fdc9-fvwzc" event={"ID":"00ebde91-a24c-4979-a38c-b69318f1a615","Type":"ContainerDied","Data":"4e905d8f2e0e09272c1667add69458a6bd100021f438503b719b88afae0af989"} Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.514343 4789 scope.go:117] "RemoveContainer" containerID="e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.549936 4789 scope.go:117] "RemoveContainer" containerID="e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.575115 4789 scope.go:117] "RemoveContainer" containerID="e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4" Dec 16 08:19:33 crc kubenswrapper[4789]: E1216 08:19:33.575726 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4\": container with ID starting with 
e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4 not found: ID does not exist" containerID="e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.575758 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4"} err="failed to get container status \"e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4\": rpc error: code = NotFound desc = could not find container \"e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4\": container with ID starting with e5f7bf2d07bf79d7328e18a3b2cb1901cbd2d65c110966c722a8a893acf701d4 not found: ID does not exist" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.575781 4789 scope.go:117] "RemoveContainer" containerID="e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988" Dec 16 08:19:33 crc kubenswrapper[4789]: E1216 08:19:33.576042 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988\": container with ID starting with e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988 not found: ID does not exist" containerID="e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.576114 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988"} err="failed to get container status \"e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988\": rpc error: code = NotFound desc = could not find container \"e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988\": container with ID starting with e5a7f9c2c76800b7b726cb3b0f5ba68ec3d28d8029106f33c33bc01c19e40988 not found: ID does not 
exist" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.674523 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-nb\") pod \"00ebde91-a24c-4979-a38c-b69318f1a615\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.674695 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-sb\") pod \"00ebde91-a24c-4979-a38c-b69318f1a615\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.674767 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hl7h\" (UniqueName: \"kubernetes.io/projected/00ebde91-a24c-4979-a38c-b69318f1a615-kube-api-access-5hl7h\") pod \"00ebde91-a24c-4979-a38c-b69318f1a615\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.674792 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-dns-svc\") pod \"00ebde91-a24c-4979-a38c-b69318f1a615\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.674826 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-config\") pod \"00ebde91-a24c-4979-a38c-b69318f1a615\" (UID: \"00ebde91-a24c-4979-a38c-b69318f1a615\") " Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.680350 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ebde91-a24c-4979-a38c-b69318f1a615-kube-api-access-5hl7h" 
(OuterVolumeSpecName: "kube-api-access-5hl7h") pod "00ebde91-a24c-4979-a38c-b69318f1a615" (UID: "00ebde91-a24c-4979-a38c-b69318f1a615"). InnerVolumeSpecName "kube-api-access-5hl7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.715792 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-config" (OuterVolumeSpecName: "config") pod "00ebde91-a24c-4979-a38c-b69318f1a615" (UID: "00ebde91-a24c-4979-a38c-b69318f1a615"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.718462 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00ebde91-a24c-4979-a38c-b69318f1a615" (UID: "00ebde91-a24c-4979-a38c-b69318f1a615"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.719403 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00ebde91-a24c-4979-a38c-b69318f1a615" (UID: "00ebde91-a24c-4979-a38c-b69318f1a615"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.720955 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00ebde91-a24c-4979-a38c-b69318f1a615" (UID: "00ebde91-a24c-4979-a38c-b69318f1a615"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.776468 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hl7h\" (UniqueName: \"kubernetes.io/projected/00ebde91-a24c-4979-a38c-b69318f1a615-kube-api-access-5hl7h\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.776498 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.776506 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.776515 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.776523 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ebde91-a24c-4979-a38c-b69318f1a615-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.864747 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7db745fdc9-fvwzc"] Dec 16 08:19:33 crc kubenswrapper[4789]: I1216 08:19:33.874390 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7db745fdc9-fvwzc"] Dec 16 08:19:34 crc kubenswrapper[4789]: I1216 08:19:34.114436 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ebde91-a24c-4979-a38c-b69318f1a615" path="/var/lib/kubelet/pods/00ebde91-a24c-4979-a38c-b69318f1a615/volumes" Dec 16 08:19:35 crc kubenswrapper[4789]: 
I1216 08:19:35.174120 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mzk6c"] Dec 16 08:19:35 crc kubenswrapper[4789]: E1216 08:19:35.174951 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ebde91-a24c-4979-a38c-b69318f1a615" containerName="init" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.174968 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ebde91-a24c-4979-a38c-b69318f1a615" containerName="init" Dec 16 08:19:35 crc kubenswrapper[4789]: E1216 08:19:35.174997 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ebde91-a24c-4979-a38c-b69318f1a615" containerName="dnsmasq-dns" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.175007 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ebde91-a24c-4979-a38c-b69318f1a615" containerName="dnsmasq-dns" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.175205 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ebde91-a24c-4979-a38c-b69318f1a615" containerName="dnsmasq-dns" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.176044 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.214381 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fd42-account-create-update-4ttgv"] Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.215622 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.219358 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.225239 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mzk6c"] Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.239340 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fd42-account-create-update-4ttgv"] Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.304268 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673468d-5701-40b7-870a-fb32e6c9d60a-operator-scripts\") pod \"cinder-fd42-account-create-update-4ttgv\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.304550 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twr26\" (UniqueName: \"kubernetes.io/projected/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-kube-api-access-twr26\") pod \"cinder-db-create-mzk6c\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.304676 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tlh5\" (UniqueName: \"kubernetes.io/projected/5673468d-5701-40b7-870a-fb32e6c9d60a-kube-api-access-5tlh5\") pod \"cinder-fd42-account-create-update-4ttgv\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.304828 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-operator-scripts\") pod \"cinder-db-create-mzk6c\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.406149 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673468d-5701-40b7-870a-fb32e6c9d60a-operator-scripts\") pod \"cinder-fd42-account-create-update-4ttgv\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.406223 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twr26\" (UniqueName: \"kubernetes.io/projected/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-kube-api-access-twr26\") pod \"cinder-db-create-mzk6c\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.406264 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tlh5\" (UniqueName: \"kubernetes.io/projected/5673468d-5701-40b7-870a-fb32e6c9d60a-kube-api-access-5tlh5\") pod \"cinder-fd42-account-create-update-4ttgv\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.406331 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-operator-scripts\") pod \"cinder-db-create-mzk6c\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.406969 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673468d-5701-40b7-870a-fb32e6c9d60a-operator-scripts\") pod \"cinder-fd42-account-create-update-4ttgv\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.407032 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-operator-scripts\") pod \"cinder-db-create-mzk6c\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.423622 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tlh5\" (UniqueName: \"kubernetes.io/projected/5673468d-5701-40b7-870a-fb32e6c9d60a-kube-api-access-5tlh5\") pod \"cinder-fd42-account-create-update-4ttgv\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.430516 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twr26\" (UniqueName: \"kubernetes.io/projected/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-kube-api-access-twr26\") pod \"cinder-db-create-mzk6c\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.495725 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.533240 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:35 crc kubenswrapper[4789]: I1216 08:19:35.933901 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mzk6c"] Dec 16 08:19:36 crc kubenswrapper[4789]: I1216 08:19:36.039447 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fd42-account-create-update-4ttgv"] Dec 16 08:19:36 crc kubenswrapper[4789]: W1216 08:19:36.046851 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5673468d_5701_40b7_870a_fb32e6c9d60a.slice/crio-9ff281b7713345989c94d9bf2e303b7eb5653d5a00e000b262543e6e730173f3 WatchSource:0}: Error finding container 9ff281b7713345989c94d9bf2e303b7eb5653d5a00e000b262543e6e730173f3: Status 404 returned error can't find the container with id 9ff281b7713345989c94d9bf2e303b7eb5653d5a00e000b262543e6e730173f3 Dec 16 08:19:36 crc kubenswrapper[4789]: I1216 08:19:36.542435 4789 generic.go:334] "Generic (PLEG): container finished" podID="9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7" containerID="895267cca919ebefdbde50b4d237657a7090e3451c3d21a98e1a922f104a929f" exitCode=0 Dec 16 08:19:36 crc kubenswrapper[4789]: I1216 08:19:36.542498 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mzk6c" event={"ID":"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7","Type":"ContainerDied","Data":"895267cca919ebefdbde50b4d237657a7090e3451c3d21a98e1a922f104a929f"} Dec 16 08:19:36 crc kubenswrapper[4789]: I1216 08:19:36.542868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mzk6c" event={"ID":"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7","Type":"ContainerStarted","Data":"c65fc5b8d257c2203df358773145594d78e457226d0a0ddd19aba8e7ee87d7a4"} Dec 16 08:19:36 crc kubenswrapper[4789]: I1216 08:19:36.544294 4789 generic.go:334] "Generic (PLEG): container finished" podID="5673468d-5701-40b7-870a-fb32e6c9d60a" 
containerID="12886505534ad285dea0ad937195d6b4c00f2e06f16c2723907d994a1152137f" exitCode=0 Dec 16 08:19:36 crc kubenswrapper[4789]: I1216 08:19:36.544327 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fd42-account-create-update-4ttgv" event={"ID":"5673468d-5701-40b7-870a-fb32e6c9d60a","Type":"ContainerDied","Data":"12886505534ad285dea0ad937195d6b4c00f2e06f16c2723907d994a1152137f"} Dec 16 08:19:36 crc kubenswrapper[4789]: I1216 08:19:36.544362 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fd42-account-create-update-4ttgv" event={"ID":"5673468d-5701-40b7-870a-fb32e6c9d60a","Type":"ContainerStarted","Data":"9ff281b7713345989c94d9bf2e303b7eb5653d5a00e000b262543e6e730173f3"} Dec 16 08:19:37 crc kubenswrapper[4789]: I1216 08:19:37.105821 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:19:37 crc kubenswrapper[4789]: E1216 08:19:37.106109 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:19:37 crc kubenswrapper[4789]: I1216 08:19:37.969946 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:37 crc kubenswrapper[4789]: I1216 08:19:37.976732 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.163019 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-operator-scripts\") pod \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.163120 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673468d-5701-40b7-870a-fb32e6c9d60a-operator-scripts\") pod \"5673468d-5701-40b7-870a-fb32e6c9d60a\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.163141 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twr26\" (UniqueName: \"kubernetes.io/projected/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-kube-api-access-twr26\") pod \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\" (UID: \"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7\") " Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.163286 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tlh5\" (UniqueName: \"kubernetes.io/projected/5673468d-5701-40b7-870a-fb32e6c9d60a-kube-api-access-5tlh5\") pod \"5673468d-5701-40b7-870a-fb32e6c9d60a\" (UID: \"5673468d-5701-40b7-870a-fb32e6c9d60a\") " Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.163795 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673468d-5701-40b7-870a-fb32e6c9d60a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5673468d-5701-40b7-870a-fb32e6c9d60a" (UID: "5673468d-5701-40b7-870a-fb32e6c9d60a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.163879 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7" (UID: "9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.168707 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5673468d-5701-40b7-870a-fb32e6c9d60a-kube-api-access-5tlh5" (OuterVolumeSpecName: "kube-api-access-5tlh5") pod "5673468d-5701-40b7-870a-fb32e6c9d60a" (UID: "5673468d-5701-40b7-870a-fb32e6c9d60a"). InnerVolumeSpecName "kube-api-access-5tlh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.168757 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-kube-api-access-twr26" (OuterVolumeSpecName: "kube-api-access-twr26") pod "9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7" (UID: "9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7"). InnerVolumeSpecName "kube-api-access-twr26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.265313 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tlh5\" (UniqueName: \"kubernetes.io/projected/5673468d-5701-40b7-870a-fb32e6c9d60a-kube-api-access-5tlh5\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.265352 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.265362 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673468d-5701-40b7-870a-fb32e6c9d60a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.265370 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twr26\" (UniqueName: \"kubernetes.io/projected/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7-kube-api-access-twr26\") on node \"crc\" DevicePath \"\"" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.563328 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mzk6c" event={"ID":"9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7","Type":"ContainerDied","Data":"c65fc5b8d257c2203df358773145594d78e457226d0a0ddd19aba8e7ee87d7a4"} Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.563382 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c65fc5b8d257c2203df358773145594d78e457226d0a0ddd19aba8e7ee87d7a4" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.563324 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mzk6c" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.564823 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fd42-account-create-update-4ttgv" event={"ID":"5673468d-5701-40b7-870a-fb32e6c9d60a","Type":"ContainerDied","Data":"9ff281b7713345989c94d9bf2e303b7eb5653d5a00e000b262543e6e730173f3"} Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.564851 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ff281b7713345989c94d9bf2e303b7eb5653d5a00e000b262543e6e730173f3" Dec 16 08:19:38 crc kubenswrapper[4789]: I1216 08:19:38.564888 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fd42-account-create-update-4ttgv" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.508854 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-m7qnf"] Dec 16 08:19:40 crc kubenswrapper[4789]: E1216 08:19:40.509472 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673468d-5701-40b7-870a-fb32e6c9d60a" containerName="mariadb-account-create-update" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.509485 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673468d-5701-40b7-870a-fb32e6c9d60a" containerName="mariadb-account-create-update" Dec 16 08:19:40 crc kubenswrapper[4789]: E1216 08:19:40.509523 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7" containerName="mariadb-database-create" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.509529 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7" containerName="mariadb-database-create" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.509687 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7" 
containerName="mariadb-database-create" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.509709 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5673468d-5701-40b7-870a-fb32e6c9d60a" containerName="mariadb-account-create-update" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.510339 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.514036 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.514036 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nlmz9" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.514123 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.534327 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m7qnf"] Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.706806 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/011d39c2-528c-42f0-8a97-5e3e06caa1c0-etc-machine-id\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.706856 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-combined-ca-bundle\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.707082 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-db-sync-config-data\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.707166 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm2xr\" (UniqueName: \"kubernetes.io/projected/011d39c2-528c-42f0-8a97-5e3e06caa1c0-kube-api-access-sm2xr\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.707343 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-scripts\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.707393 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-config-data\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.809843 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm2xr\" (UniqueName: \"kubernetes.io/projected/011d39c2-528c-42f0-8a97-5e3e06caa1c0-kube-api-access-sm2xr\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.809980 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-scripts\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.810015 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-config-data\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.810040 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/011d39c2-528c-42f0-8a97-5e3e06caa1c0-etc-machine-id\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.810068 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-combined-ca-bundle\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.810143 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-db-sync-config-data\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.810487 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/011d39c2-528c-42f0-8a97-5e3e06caa1c0-etc-machine-id\") pod \"cinder-db-sync-m7qnf\" (UID: 
\"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.817304 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-scripts\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.818660 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-db-sync-config-data\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.820246 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-config-data\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.821730 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-combined-ca-bundle\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.827099 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm2xr\" (UniqueName: \"kubernetes.io/projected/011d39c2-528c-42f0-8a97-5e3e06caa1c0-kube-api-access-sm2xr\") pod \"cinder-db-sync-m7qnf\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:40 crc kubenswrapper[4789]: I1216 08:19:40.834625 4789 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:19:41 crc kubenswrapper[4789]: I1216 08:19:41.292524 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m7qnf"] Dec 16 08:19:41 crc kubenswrapper[4789]: W1216 08:19:41.294108 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod011d39c2_528c_42f0_8a97_5e3e06caa1c0.slice/crio-29d05fd9ca380b6930a7765acaf50d25f38131bea10be6b88f39edeb91a81699 WatchSource:0}: Error finding container 29d05fd9ca380b6930a7765acaf50d25f38131bea10be6b88f39edeb91a81699: Status 404 returned error can't find the container with id 29d05fd9ca380b6930a7765acaf50d25f38131bea10be6b88f39edeb91a81699 Dec 16 08:19:41 crc kubenswrapper[4789]: I1216 08:19:41.599780 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7qnf" event={"ID":"011d39c2-528c-42f0-8a97-5e3e06caa1c0","Type":"ContainerStarted","Data":"29d05fd9ca380b6930a7765acaf50d25f38131bea10be6b88f39edeb91a81699"} Dec 16 08:19:51 crc kubenswrapper[4789]: I1216 08:19:51.105848 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:19:51 crc kubenswrapper[4789]: E1216 08:19:51.107388 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:20:02 crc kubenswrapper[4789]: E1216 08:20:02.058867 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:20:02 crc kubenswrapper[4789]: E1216 08:20:02.059423 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 08:20:02 crc kubenswrapper[4789]: E1216 08:20:02.059533 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combin
ed-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sm2xr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-m7qnf_openstack(011d39c2-528c-42f0-8a97-5e3e06caa1c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 08:20:02 crc kubenswrapper[4789]: E1216 08:20:02.060714 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-m7qnf" podUID="011d39c2-528c-42f0-8a97-5e3e06caa1c0" Dec 16 08:20:02 crc kubenswrapper[4789]: E1216 08:20:02.777106 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/cinder-db-sync-m7qnf" podUID="011d39c2-528c-42f0-8a97-5e3e06caa1c0" Dec 16 08:20:03 crc kubenswrapper[4789]: I1216 08:20:03.104854 4789 scope.go:117] "RemoveContainer" 
containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:20:03 crc kubenswrapper[4789]: E1216 08:20:03.105130 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:20:16 crc kubenswrapper[4789]: I1216 08:20:16.923868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7qnf" event={"ID":"011d39c2-528c-42f0-8a97-5e3e06caa1c0","Type":"ContainerStarted","Data":"255a38ae88936dc2f3cffa4a9fcd2c9d567efcb4527cd4fe51a755020b694393"} Dec 16 08:20:16 crc kubenswrapper[4789]: I1216 08:20:16.958578 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-m7qnf" podStartSLOduration=2.947868395 podStartE2EDuration="36.95855576s" podCreationTimestamp="2025-12-16 08:19:40 +0000 UTC" firstStartedPulling="2025-12-16 08:19:41.296546517 +0000 UTC m=+5319.558434146" lastFinishedPulling="2025-12-16 08:20:15.307233882 +0000 UTC m=+5353.569121511" observedRunningTime="2025-12-16 08:20:16.947076739 +0000 UTC m=+5355.208964368" watchObservedRunningTime="2025-12-16 08:20:16.95855576 +0000 UTC m=+5355.220443389" Dec 16 08:20:17 crc kubenswrapper[4789]: I1216 08:20:17.104695 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:20:17 crc kubenswrapper[4789]: E1216 08:20:17.105123 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:20:18 crc kubenswrapper[4789]: I1216 08:20:18.948680 4789 generic.go:334] "Generic (PLEG): container finished" podID="011d39c2-528c-42f0-8a97-5e3e06caa1c0" containerID="255a38ae88936dc2f3cffa4a9fcd2c9d567efcb4527cd4fe51a755020b694393" exitCode=0 Dec 16 08:20:18 crc kubenswrapper[4789]: I1216 08:20:18.948758 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7qnf" event={"ID":"011d39c2-528c-42f0-8a97-5e3e06caa1c0","Type":"ContainerDied","Data":"255a38ae88936dc2f3cffa4a9fcd2c9d567efcb4527cd4fe51a755020b694393"} Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.303811 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.371629 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-combined-ca-bundle\") pod \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.371698 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/011d39c2-528c-42f0-8a97-5e3e06caa1c0-etc-machine-id\") pod \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.371803 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-config-data\") pod \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\" (UID: 
\"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.371828 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm2xr\" (UniqueName: \"kubernetes.io/projected/011d39c2-528c-42f0-8a97-5e3e06caa1c0-kube-api-access-sm2xr\") pod \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.371983 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-scripts\") pod \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.372012 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-db-sync-config-data\") pod \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\" (UID: \"011d39c2-528c-42f0-8a97-5e3e06caa1c0\") " Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.382494 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011d39c2-528c-42f0-8a97-5e3e06caa1c0-kube-api-access-sm2xr" (OuterVolumeSpecName: "kube-api-access-sm2xr") pod "011d39c2-528c-42f0-8a97-5e3e06caa1c0" (UID: "011d39c2-528c-42f0-8a97-5e3e06caa1c0"). InnerVolumeSpecName "kube-api-access-sm2xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.382880 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "011d39c2-528c-42f0-8a97-5e3e06caa1c0" (UID: "011d39c2-528c-42f0-8a97-5e3e06caa1c0"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.383507 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/011d39c2-528c-42f0-8a97-5e3e06caa1c0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "011d39c2-528c-42f0-8a97-5e3e06caa1c0" (UID: "011d39c2-528c-42f0-8a97-5e3e06caa1c0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.384257 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-scripts" (OuterVolumeSpecName: "scripts") pod "011d39c2-528c-42f0-8a97-5e3e06caa1c0" (UID: "011d39c2-528c-42f0-8a97-5e3e06caa1c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.405307 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "011d39c2-528c-42f0-8a97-5e3e06caa1c0" (UID: "011d39c2-528c-42f0-8a97-5e3e06caa1c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.420118 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-config-data" (OuterVolumeSpecName: "config-data") pod "011d39c2-528c-42f0-8a97-5e3e06caa1c0" (UID: "011d39c2-528c-42f0-8a97-5e3e06caa1c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.474548 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.475590 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/011d39c2-528c-42f0-8a97-5e3e06caa1c0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.475721 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm2xr\" (UniqueName: \"kubernetes.io/projected/011d39c2-528c-42f0-8a97-5e3e06caa1c0-kube-api-access-sm2xr\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.475808 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.475953 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.476079 4789 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/011d39c2-528c-42f0-8a97-5e3e06caa1c0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.964713 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m7qnf" event={"ID":"011d39c2-528c-42f0-8a97-5e3e06caa1c0","Type":"ContainerDied","Data":"29d05fd9ca380b6930a7765acaf50d25f38131bea10be6b88f39edeb91a81699"} Dec 16 08:20:20 crc 
kubenswrapper[4789]: I1216 08:20:20.964751 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d05fd9ca380b6930a7765acaf50d25f38131bea10be6b88f39edeb91a81699" Dec 16 08:20:20 crc kubenswrapper[4789]: I1216 08:20:20.964786 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m7qnf" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.376131 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86f8595455-rc2h8"] Dec 16 08:20:21 crc kubenswrapper[4789]: E1216 08:20:21.376529 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011d39c2-528c-42f0-8a97-5e3e06caa1c0" containerName="cinder-db-sync" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.376541 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="011d39c2-528c-42f0-8a97-5e3e06caa1c0" containerName="cinder-db-sync" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.376744 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="011d39c2-528c-42f0-8a97-5e3e06caa1c0" containerName="cinder-db-sync" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.377679 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.397585 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfk8\" (UniqueName: \"kubernetes.io/projected/4094f3ce-80f9-4542-ab32-9fb80b03b11b-kube-api-access-hlfk8\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.397636 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-sb\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.397789 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-config\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.397852 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-dns-svc\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.397906 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-nb\") pod \"dnsmasq-dns-86f8595455-rc2h8\" 
(UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.402232 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86f8595455-rc2h8"] Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.499803 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-dns-svc\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.499902 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-nb\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.500006 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfk8\" (UniqueName: \"kubernetes.io/projected/4094f3ce-80f9-4542-ab32-9fb80b03b11b-kube-api-access-hlfk8\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.500035 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-sb\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.500080 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-config\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.501728 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-dns-svc\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.501751 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-sb\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.501875 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-config\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.501900 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-nb\") pod \"dnsmasq-dns-86f8595455-rc2h8\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.519521 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfk8\" (UniqueName: \"kubernetes.io/projected/4094f3ce-80f9-4542-ab32-9fb80b03b11b-kube-api-access-hlfk8\") pod \"dnsmasq-dns-86f8595455-rc2h8\" 
(UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.531290 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.534287 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.538218 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.538223 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.538254 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nlmz9" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.539357 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.543390 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.698454 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.702543 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data-custom\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.702843 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/302f90e8-3193-47ca-ada3-03a19e0d8f32-etc-machine-id\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.702892 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302f90e8-3193-47ca-ada3-03a19e0d8f32-logs\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.702931 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwb4t\" (UniqueName: \"kubernetes.io/projected/302f90e8-3193-47ca-ada3-03a19e0d8f32-kube-api-access-xwb4t\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.703080 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 
08:20:21.703221 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-scripts\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.703313 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.804734 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.804796 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-scripts\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.804830 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.804854 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data-custom\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.804900 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/302f90e8-3193-47ca-ada3-03a19e0d8f32-etc-machine-id\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.804960 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302f90e8-3193-47ca-ada3-03a19e0d8f32-logs\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.804978 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwb4t\" (UniqueName: \"kubernetes.io/projected/302f90e8-3193-47ca-ada3-03a19e0d8f32-kube-api-access-xwb4t\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.805074 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/302f90e8-3193-47ca-ada3-03a19e0d8f32-etc-machine-id\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.805583 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302f90e8-3193-47ca-ada3-03a19e0d8f32-logs\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.808810 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-scripts\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.809006 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.809974 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.810712 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data-custom\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.825463 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwb4t\" (UniqueName: \"kubernetes.io/projected/302f90e8-3193-47ca-ada3-03a19e0d8f32-kube-api-access-xwb4t\") pod \"cinder-api-0\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " pod="openstack/cinder-api-0" Dec 16 08:20:21 crc kubenswrapper[4789]: I1216 08:20:21.896058 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:20:22 crc kubenswrapper[4789]: I1216 08:20:22.275732 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86f8595455-rc2h8"] Dec 16 08:20:22 crc kubenswrapper[4789]: I1216 08:20:22.451805 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:22 crc kubenswrapper[4789]: I1216 08:20:22.986392 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"302f90e8-3193-47ca-ada3-03a19e0d8f32","Type":"ContainerStarted","Data":"9353c0029bb8388061a7388debfb0be0aead88ac59730e7574087aa33b4c4599"} Dec 16 08:20:22 crc kubenswrapper[4789]: I1216 08:20:22.990735 4789 generic.go:334] "Generic (PLEG): container finished" podID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerID="b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc" exitCode=0 Dec 16 08:20:22 crc kubenswrapper[4789]: I1216 08:20:22.990773 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" event={"ID":"4094f3ce-80f9-4542-ab32-9fb80b03b11b","Type":"ContainerDied","Data":"b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc"} Dec 16 08:20:22 crc kubenswrapper[4789]: I1216 08:20:22.990799 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" event={"ID":"4094f3ce-80f9-4542-ab32-9fb80b03b11b","Type":"ContainerStarted","Data":"a248f107961e40338c09a322a826d38dd1d7589d5de160420f53ece996bb19d8"} Dec 16 08:20:24 crc kubenswrapper[4789]: I1216 08:20:24.006868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"302f90e8-3193-47ca-ada3-03a19e0d8f32","Type":"ContainerStarted","Data":"45f8f9ae2e88c319d111ccd07788b7082f88c961f1415e62d8815ec42ac556f0"} Dec 16 08:20:24 crc kubenswrapper[4789]: I1216 08:20:24.007203 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cinder-api-0" Dec 16 08:20:24 crc kubenswrapper[4789]: I1216 08:20:24.007217 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"302f90e8-3193-47ca-ada3-03a19e0d8f32","Type":"ContainerStarted","Data":"7119954074edc2147b9f80c3e275ef3c65846c611f8d24020542151fdc24dfed"} Dec 16 08:20:24 crc kubenswrapper[4789]: I1216 08:20:24.009248 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" event={"ID":"4094f3ce-80f9-4542-ab32-9fb80b03b11b","Type":"ContainerStarted","Data":"6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1"} Dec 16 08:20:24 crc kubenswrapper[4789]: I1216 08:20:24.009631 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:24 crc kubenswrapper[4789]: I1216 08:20:24.028220 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.02820306 podStartE2EDuration="3.02820306s" podCreationTimestamp="2025-12-16 08:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:24.024645863 +0000 UTC m=+5362.286533492" watchObservedRunningTime="2025-12-16 08:20:24.02820306 +0000 UTC m=+5362.290090689" Dec 16 08:20:24 crc kubenswrapper[4789]: I1216 08:20:24.060596 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" podStartSLOduration=3.060580922 podStartE2EDuration="3.060580922s" podCreationTimestamp="2025-12-16 08:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:24.060402608 +0000 UTC m=+5362.322290237" watchObservedRunningTime="2025-12-16 08:20:24.060580922 +0000 UTC m=+5362.322468551" Dec 16 08:20:31 crc kubenswrapper[4789]: I1216 
08:20:31.700085 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:20:31 crc kubenswrapper[4789]: I1216 08:20:31.764369 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d8668765-x94jg"] Dec 16 08:20:31 crc kubenswrapper[4789]: I1216 08:20:31.764616 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d8668765-x94jg" podUID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" containerName="dnsmasq-dns" containerID="cri-o://5a51d5d3e116521c9c6719588539232c77dabed6d40d1d88710cf84db8d2ebcf" gracePeriod=10 Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.103455 4789 generic.go:334] "Generic (PLEG): container finished" podID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" containerID="5a51d5d3e116521c9c6719588539232c77dabed6d40d1d88710cf84db8d2ebcf" exitCode=0 Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.103795 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8668765-x94jg" event={"ID":"6e45f49a-ddb6-4f06-83b7-3492b38128ce","Type":"ContainerDied","Data":"5a51d5d3e116521c9c6719588539232c77dabed6d40d1d88710cf84db8d2ebcf"} Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.112924 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:20:32 crc kubenswrapper[4789]: E1216 08:20:32.113115 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.303876 4789 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.420685 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-dns-svc\") pod \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.420747 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-config\") pod \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.421025 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-nb\") pod \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.421073 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x7fg\" (UniqueName: \"kubernetes.io/projected/6e45f49a-ddb6-4f06-83b7-3492b38128ce-kube-api-access-5x7fg\") pod \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.421115 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-sb\") pod \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\" (UID: \"6e45f49a-ddb6-4f06-83b7-3492b38128ce\") " Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.451972 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6e45f49a-ddb6-4f06-83b7-3492b38128ce-kube-api-access-5x7fg" (OuterVolumeSpecName: "kube-api-access-5x7fg") pod "6e45f49a-ddb6-4f06-83b7-3492b38128ce" (UID: "6e45f49a-ddb6-4f06-83b7-3492b38128ce"). InnerVolumeSpecName "kube-api-access-5x7fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.481231 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e45f49a-ddb6-4f06-83b7-3492b38128ce" (UID: "6e45f49a-ddb6-4f06-83b7-3492b38128ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.482549 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e45f49a-ddb6-4f06-83b7-3492b38128ce" (UID: "6e45f49a-ddb6-4f06-83b7-3492b38128ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.491430 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e45f49a-ddb6-4f06-83b7-3492b38128ce" (UID: "6e45f49a-ddb6-4f06-83b7-3492b38128ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.512030 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-config" (OuterVolumeSpecName: "config") pod "6e45f49a-ddb6-4f06-83b7-3492b38128ce" (UID: "6e45f49a-ddb6-4f06-83b7-3492b38128ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.523264 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.523305 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.523318 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.523332 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x7fg\" (UniqueName: \"kubernetes.io/projected/6e45f49a-ddb6-4f06-83b7-3492b38128ce-kube-api-access-5x7fg\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:32 crc kubenswrapper[4789]: I1216 08:20:32.523343 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e45f49a-ddb6-4f06-83b7-3492b38128ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.114211 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d8668765-x94jg" event={"ID":"6e45f49a-ddb6-4f06-83b7-3492b38128ce","Type":"ContainerDied","Data":"75b34cccb9e89cf9b6a450d6b3598d7a8bdcebd2d5bfbe92cbc4a0c066cc53cf"} Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.114263 4789 scope.go:117] "RemoveContainer" containerID="5a51d5d3e116521c9c6719588539232c77dabed6d40d1d88710cf84db8d2ebcf" Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.114293 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d8668765-x94jg" Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.165993 4789 scope.go:117] "RemoveContainer" containerID="79c140cdd409de11a720345fc5bf860956f4328bde37ab51c6a4162a85832052" Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.166739 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.166990 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7096ea39-3a7b-4eca-b47b-953462331ae8" containerName="nova-scheduler-scheduler" containerID="cri-o://90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.228303 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.228603 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-log" containerID="cri-o://86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.228788 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-api" containerID="cri-o://5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.247664 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.247933 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="ec472b51-08c1-499e-8b85-e103741b35d8" 
containerName="nova-cell0-conductor-conductor" containerID="cri-o://dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.288617 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.288866 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7864394a-9bf8-40d1-b8a8-8b5b989516bb" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.301493 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d8668765-x94jg"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.310517 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.310773 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-log" containerID="cri-o://8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.310943 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-metadata" containerID="cri-o://6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.323191 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d8668765-x94jg"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.333883 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:20:33 crc kubenswrapper[4789]: I1216 08:20:33.334125 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="3a2c87a8-8e65-4763-9ae8-1507026f0904" containerName="nova-cell1-conductor-conductor" containerID="cri-o://8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" gracePeriod=30 Dec 16 08:20:33 crc kubenswrapper[4789]: E1216 08:20:33.361059 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 08:20:33 crc kubenswrapper[4789]: E1216 08:20:33.368024 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 08:20:33 crc kubenswrapper[4789]: E1216 08:20:33.378987 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 08:20:33 crc kubenswrapper[4789]: E1216 08:20:33.379053 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="3a2c87a8-8e65-4763-9ae8-1507026f0904" containerName="nova-cell1-conductor-conductor" Dec 16 08:20:34 
crc kubenswrapper[4789]: I1216 08:20:34.069957 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.115330 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" path="/var/lib/kubelet/pods/6e45f49a-ddb6-4f06-83b7-3492b38128ce/volumes" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.135270 4789 generic.go:334] "Generic (PLEG): container finished" podID="7864394a-9bf8-40d1-b8a8-8b5b989516bb" containerID="9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9" exitCode=0 Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.135330 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.135372 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7864394a-9bf8-40d1-b8a8-8b5b989516bb","Type":"ContainerDied","Data":"9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9"} Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.135402 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7864394a-9bf8-40d1-b8a8-8b5b989516bb","Type":"ContainerDied","Data":"4cfc6452302d4d167c76cf40ad3bee41b36084805c1d77d7b712927c5f64decb"} Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.135420 4789 scope.go:117] "RemoveContainer" containerID="9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.143236 4789 generic.go:334] "Generic (PLEG): container finished" podID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerID="8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04" exitCode=143 Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.143578 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a","Type":"ContainerDied","Data":"8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04"} Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.156678 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.156865 4789 generic.go:334] "Generic (PLEG): container finished" podID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerID="86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494" exitCode=143 Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.156949 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a1a792a-ba73-4aa6-ad00-a15960cdecef","Type":"ContainerDied","Data":"86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494"} Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.181117 4789 scope.go:117] "RemoveContainer" containerID="9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9" Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.181603 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9\": container with ID starting with 9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9 not found: ID does not exist" containerID="9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.181633 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9"} err="failed to get container status \"9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9\": rpc error: code = NotFound desc = could not find container 
\"9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9\": container with ID starting with 9ed313e6db68f164c11a3a34a47b7356c7c210dee682f7aa9f45e2e2769141a9 not found: ID does not exist" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.196785 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjp9q\" (UniqueName: \"kubernetes.io/projected/7864394a-9bf8-40d1-b8a8-8b5b989516bb-kube-api-access-fjp9q\") pod \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.197367 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-config-data\") pod \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.197563 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-combined-ca-bundle\") pod \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\" (UID: \"7864394a-9bf8-40d1-b8a8-8b5b989516bb\") " Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.225050 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7864394a-9bf8-40d1-b8a8-8b5b989516bb-kube-api-access-fjp9q" (OuterVolumeSpecName: "kube-api-access-fjp9q") pod "7864394a-9bf8-40d1-b8a8-8b5b989516bb" (UID: "7864394a-9bf8-40d1-b8a8-8b5b989516bb"). InnerVolumeSpecName "kube-api-access-fjp9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.280058 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7864394a-9bf8-40d1-b8a8-8b5b989516bb" (UID: "7864394a-9bf8-40d1-b8a8-8b5b989516bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.292189 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-config-data" (OuterVolumeSpecName: "config-data") pod "7864394a-9bf8-40d1-b8a8-8b5b989516bb" (UID: "7864394a-9bf8-40d1-b8a8-8b5b989516bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.309152 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.309182 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7864394a-9bf8-40d1-b8a8-8b5b989516bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.309196 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjp9q\" (UniqueName: \"kubernetes.io/projected/7864394a-9bf8-40d1-b8a8-8b5b989516bb-kube-api-access-fjp9q\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.510085 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.541339 4789 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.555312 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.557004 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.563887 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.564299 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" containerName="init" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.564317 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" containerName="init" Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.564331 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7864394a-9bf8-40d1-b8a8-8b5b989516bb" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.564337 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7864394a-9bf8-40d1-b8a8-8b5b989516bb" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.564351 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" 
containerName="dnsmasq-dns" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.564356 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" containerName="dnsmasq-dns" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.564542 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e45f49a-ddb6-4f06-83b7-3492b38128ce" containerName="dnsmasq-dns" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.564560 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7864394a-9bf8-40d1-b8a8-8b5b989516bb" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.565165 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.573315 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.573646 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7096ea39-3a7b-4eca-b47b-953462331ae8" containerName="nova-scheduler-scheduler" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.573995 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.576482 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.618121 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x756\" (UniqueName: \"kubernetes.io/projected/84c5c815-268d-487d-a36e-4f9db5e4ae44-kube-api-access-2x756\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.618202 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c5c815-268d-487d-a36e-4f9db5e4ae44-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.619211 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c5c815-268d-487d-a36e-4f9db5e4ae44-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.720088 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x756\" (UniqueName: \"kubernetes.io/projected/84c5c815-268d-487d-a36e-4f9db5e4ae44-kube-api-access-2x756\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.720141 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c5c815-268d-487d-a36e-4f9db5e4ae44-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.720180 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c5c815-268d-487d-a36e-4f9db5e4ae44-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.723889 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c5c815-268d-487d-a36e-4f9db5e4ae44-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.724364 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c5c815-268d-487d-a36e-4f9db5e4ae44-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.737637 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x756\" (UniqueName: \"kubernetes.io/projected/84c5c815-268d-487d-a36e-4f9db5e4ae44-kube-api-access-2x756\") pod \"nova-cell1-novncproxy-0\" (UID: \"84c5c815-268d-487d-a36e-4f9db5e4ae44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: I1216 08:20:34.889440 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.982592 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.987277 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.996268 4789 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 08:20:34 crc kubenswrapper[4789]: E1216 08:20:34.996350 4789 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="ec472b51-08c1-499e-8b85-e103741b35d8" containerName="nova-cell0-conductor-conductor" Dec 16 08:20:35 crc kubenswrapper[4789]: I1216 08:20:35.328195 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 08:20:35 crc kubenswrapper[4789]: W1216 08:20:35.333140 4789 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c5c815_268d_487d_a36e_4f9db5e4ae44.slice/crio-099a2dd71bd4cf12752fac6e4f9ad6d93604e0c48448ed7657e3ea56def698cd WatchSource:0}: Error finding container 099a2dd71bd4cf12752fac6e4f9ad6d93604e0c48448ed7657e3ea56def698cd: Status 404 returned error can't find the container with id 099a2dd71bd4cf12752fac6e4f9ad6d93604e0c48448ed7657e3ea56def698cd Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.117314 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7864394a-9bf8-40d1-b8a8-8b5b989516bb" path="/var/lib/kubelet/pods/7864394a-9bf8-40d1-b8a8-8b5b989516bb/volumes" Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.176287 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84c5c815-268d-487d-a36e-4f9db5e4ae44","Type":"ContainerStarted","Data":"3483b2bd3e601679580a0ba4537298f4d0585487575b6aa07c31d75bf6510b3e"} Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.176649 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84c5c815-268d-487d-a36e-4f9db5e4ae44","Type":"ContainerStarted","Data":"099a2dd71bd4cf12752fac6e4f9ad6d93604e0c48448ed7657e3ea56def698cd"} Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.199714 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.199692827 podStartE2EDuration="2.199692827s" podCreationTimestamp="2025-12-16 08:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:36.190662735 +0000 UTC m=+5374.452550364" watchObservedRunningTime="2025-12-16 08:20:36.199692827 +0000 UTC m=+5374.461580456" Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.630743 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" 
podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": dial tcp 10.217.1.73:8775: connect: connection refused" Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.631097 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": dial tcp 10.217.1.73:8775: connect: connection refused" Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.951846 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:20:36 crc kubenswrapper[4789]: I1216 08:20:36.957757 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.068169 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-logs\") pod \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.068266 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a792a-ba73-4aa6-ad00-a15960cdecef-logs\") pod \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.068314 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6bwq\" (UniqueName: \"kubernetes.io/projected/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-kube-api-access-t6bwq\") pod \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 
08:20:37.068375 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-config-data\") pod \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.068413 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjgsx\" (UniqueName: \"kubernetes.io/projected/0a1a792a-ba73-4aa6-ad00-a15960cdecef-kube-api-access-hjgsx\") pod \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.068447 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-combined-ca-bundle\") pod \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\" (UID: \"0a1a792a-ba73-4aa6-ad00-a15960cdecef\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.068524 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-combined-ca-bundle\") pod \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.068564 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-config-data\") pod \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\" (UID: \"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a\") " Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.086177 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a1a792a-ba73-4aa6-ad00-a15960cdecef-logs" (OuterVolumeSpecName: "logs") pod 
"0a1a792a-ba73-4aa6-ad00-a15960cdecef" (UID: "0a1a792a-ba73-4aa6-ad00-a15960cdecef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.086537 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-logs" (OuterVolumeSpecName: "logs") pod "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" (UID: "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.105094 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1a792a-ba73-4aa6-ad00-a15960cdecef-kube-api-access-hjgsx" (OuterVolumeSpecName: "kube-api-access-hjgsx") pod "0a1a792a-ba73-4aa6-ad00-a15960cdecef" (UID: "0a1a792a-ba73-4aa6-ad00-a15960cdecef"). InnerVolumeSpecName "kube-api-access-hjgsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.124368 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-config-data" (OuterVolumeSpecName: "config-data") pod "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" (UID: "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.124465 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-kube-api-access-t6bwq" (OuterVolumeSpecName: "kube-api-access-t6bwq") pod "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" (UID: "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a"). InnerVolumeSpecName "kube-api-access-t6bwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.152033 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a1a792a-ba73-4aa6-ad00-a15960cdecef" (UID: "0a1a792a-ba73-4aa6-ad00-a15960cdecef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.154039 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-config-data" (OuterVolumeSpecName: "config-data") pod "0a1a792a-ba73-4aa6-ad00-a15960cdecef" (UID: "0a1a792a-ba73-4aa6-ad00-a15960cdecef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.171184 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a1a792a-ba73-4aa6-ad00-a15960cdecef-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.171225 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6bwq\" (UniqueName: \"kubernetes.io/projected/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-kube-api-access-t6bwq\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.171239 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.171252 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjgsx\" (UniqueName: \"kubernetes.io/projected/0a1a792a-ba73-4aa6-ad00-a15960cdecef-kube-api-access-hjgsx\") on node \"crc\" DevicePath 
\"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.171266 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a1a792a-ba73-4aa6-ad00-a15960cdecef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.171277 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.171288 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.173362 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" (UID: "f1c545d5-cbc5-4506-99dc-4ac2b48abb6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.202138 4789 generic.go:334] "Generic (PLEG): container finished" podID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerID="5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98" exitCode=0 Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.202583 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.202967 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a1a792a-ba73-4aa6-ad00-a15960cdecef","Type":"ContainerDied","Data":"5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98"} Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.203046 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0a1a792a-ba73-4aa6-ad00-a15960cdecef","Type":"ContainerDied","Data":"7b37cb619fec967c402869ba4ff7d5a4dc9e2c69a1633b57743b2e8a3680ea99"} Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.203067 4789 scope.go:117] "RemoveContainer" containerID="5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.212202 4789 generic.go:334] "Generic (PLEG): container finished" podID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerID="6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8" exitCode=0 Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.213061 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.216961 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a","Type":"ContainerDied","Data":"6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8"} Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.217165 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1c545d5-cbc5-4506-99dc-4ac2b48abb6a","Type":"ContainerDied","Data":"8e37fdb9d882daec8f22322c8a895197b75f2923905335ac654b9e505beaedb1"} Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.241559 4789 scope.go:117] "RemoveContainer" containerID="86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.249840 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.258354 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.286314 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.287977 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.288421 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-api" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288444 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-api" Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.288469 
4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-log" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288478 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-log" Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.288499 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-metadata" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288506 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-metadata" Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.288547 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-log" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288556 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-log" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288792 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-api" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288832 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" containerName="nova-api-log" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288852 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-log" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.288866 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" containerName="nova-metadata-metadata" Dec 16 08:20:37 crc kubenswrapper[4789]: 
I1216 08:20:37.289217 4789 scope.go:117] "RemoveContainer" containerID="5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98" Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.289701 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98\": container with ID starting with 5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98 not found: ID does not exist" containerID="5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.289744 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98"} err="failed to get container status \"5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98\": rpc error: code = NotFound desc = could not find container \"5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98\": container with ID starting with 5ba4a324e295d6fd2ecbc66090dc4b6f6122756ce7f6ac6bad59d4520e7bfe98 not found: ID does not exist" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.289766 4789 scope.go:117] "RemoveContainer" containerID="86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494" Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.290001 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494\": container with ID starting with 86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494 not found: ID does not exist" containerID="86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.290020 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494"} err="failed to get container status \"86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494\": rpc error: code = NotFound desc = could not find container \"86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494\": container with ID starting with 86f02ced9123cdf9b5d02f8acd31b50cf5e39604c483fc6f07c5100d40d75494 not found: ID does not exist" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.290034 4789 scope.go:117] "RemoveContainer" containerID="6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.308377 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.308497 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.315517 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.323734 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.327332 4789 scope.go:117] "RemoveContainer" containerID="8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.342040 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.357093 4789 scope.go:117] "RemoveContainer" containerID="6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8" Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.358986 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8\": container with ID starting with 6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8 not found: ID does not exist" containerID="6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.359031 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8"} err="failed to get container status \"6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8\": rpc error: code = NotFound desc = could not find container \"6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8\": container with ID starting with 6e5a23aeb7fd6b6dd142eb623e2dab4d1f1cde9a9fb5a0269e47031ccf35aea8 not found: ID does not exist" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.359054 4789 scope.go:117] "RemoveContainer" containerID="8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04" Dec 16 08:20:37 crc kubenswrapper[4789]: E1216 08:20:37.359456 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04\": container with ID starting with 8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04 not found: ID does not exist" containerID="8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.359480 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04"} err="failed to get container status \"8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04\": rpc error: code = NotFound desc = could not find container \"8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04\": container with ID 
starting with 8ca79953790c0a02e1f56f94ed49b70ef354e1746ac50e3623bea2908762ab04 not found: ID does not exist" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.366991 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.368566 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.371464 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.388072 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6465ba6-5f85-4b46-baea-61349bea2e86-logs\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.388112 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.388137 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jwr4\" (UniqueName: \"kubernetes.io/projected/c6465ba6-5f85-4b46-baea-61349bea2e86-kube-api-access-4jwr4\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.388182 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-config-data\") pod \"nova-api-0\" 
(UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.396362 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.490609 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6465ba6-5f85-4b46-baea-61349bea2e86-logs\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.490678 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.490738 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jwr4\" (UniqueName: \"kubernetes.io/projected/c6465ba6-5f85-4b46-baea-61349bea2e86-kube-api-access-4jwr4\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.490764 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.490856 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d2253f-62e3-4a40-a1ff-66802515e914-logs\") pod \"nova-metadata-0\" (UID: 
\"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.490905 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-config-data\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.491112 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-config-data\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.491238 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtk4\" (UniqueName: \"kubernetes.io/projected/83d2253f-62e3-4a40-a1ff-66802515e914-kube-api-access-7xtk4\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.491744 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6465ba6-5f85-4b46-baea-61349bea2e86-logs\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.496556 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.497228 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-config-data\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.509494 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jwr4\" (UniqueName: \"kubernetes.io/projected/c6465ba6-5f85-4b46-baea-61349bea2e86-kube-api-access-4jwr4\") pod \"nova-api-0\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.593508 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.593588 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d2253f-62e3-4a40-a1ff-66802515e914-logs\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.593630 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-config-data\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.593714 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtk4\" (UniqueName: \"kubernetes.io/projected/83d2253f-62e3-4a40-a1ff-66802515e914-kube-api-access-7xtk4\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " 
pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.594400 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d2253f-62e3-4a40-a1ff-66802515e914-logs\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.599654 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.599778 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-config-data\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.623447 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtk4\" (UniqueName: \"kubernetes.io/projected/83d2253f-62e3-4a40-a1ff-66802515e914-kube-api-access-7xtk4\") pod \"nova-metadata-0\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " pod="openstack/nova-metadata-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.642581 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 08:20:37 crc kubenswrapper[4789]: I1216 08:20:37.695495 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.130226 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a1a792a-ba73-4aa6-ad00-a15960cdecef" path="/var/lib/kubelet/pods/0a1a792a-ba73-4aa6-ad00-a15960cdecef/volumes" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.131294 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c545d5-cbc5-4506-99dc-4ac2b48abb6a" path="/var/lib/kubelet/pods/f1c545d5-cbc5-4506-99dc-4ac2b48abb6a/volumes" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.225076 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.233853 4789 generic.go:334] "Generic (PLEG): container finished" podID="3a2c87a8-8e65-4763-9ae8-1507026f0904" containerID="8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" exitCode=0 Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.233927 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3a2c87a8-8e65-4763-9ae8-1507026f0904","Type":"ContainerDied","Data":"8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae"} Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.233953 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3a2c87a8-8e65-4763-9ae8-1507026f0904","Type":"ContainerDied","Data":"a46b582c389621d19223119e37163b46b6526a08f9d616739533dec50ad549f2"} Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.233970 4789 scope.go:117] "RemoveContainer" containerID="8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.233965 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.262783 4789 scope.go:117] "RemoveContainer" containerID="8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" Dec 16 08:20:38 crc kubenswrapper[4789]: E1216 08:20:38.268392 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae\": container with ID starting with 8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae not found: ID does not exist" containerID="8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.268448 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae"} err="failed to get container status \"8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae\": rpc error: code = NotFound desc = could not find container \"8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae\": container with ID starting with 8661d7de5d4c47b0a2dc2f767438dfdbda088d18bbe2a9bf3db1aa8d006c93ae not found: ID does not exist" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.304384 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qgxn\" (UniqueName: \"kubernetes.io/projected/3a2c87a8-8e65-4763-9ae8-1507026f0904-kube-api-access-6qgxn\") pod \"3a2c87a8-8e65-4763-9ae8-1507026f0904\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.304485 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-config-data\") pod \"3a2c87a8-8e65-4763-9ae8-1507026f0904\" (UID: 
\"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.304591 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-combined-ca-bundle\") pod \"3a2c87a8-8e65-4763-9ae8-1507026f0904\" (UID: \"3a2c87a8-8e65-4763-9ae8-1507026f0904\") " Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.312164 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2c87a8-8e65-4763-9ae8-1507026f0904-kube-api-access-6qgxn" (OuterVolumeSpecName: "kube-api-access-6qgxn") pod "3a2c87a8-8e65-4763-9ae8-1507026f0904" (UID: "3a2c87a8-8e65-4763-9ae8-1507026f0904"). InnerVolumeSpecName "kube-api-access-6qgxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.338006 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a2c87a8-8e65-4763-9ae8-1507026f0904" (UID: "3a2c87a8-8e65-4763-9ae8-1507026f0904"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.347591 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-config-data" (OuterVolumeSpecName: "config-data") pod "3a2c87a8-8e65-4763-9ae8-1507026f0904" (UID: "3a2c87a8-8e65-4763-9ae8-1507026f0904"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.368901 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.406585 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qgxn\" (UniqueName: \"kubernetes.io/projected/3a2c87a8-8e65-4763-9ae8-1507026f0904-kube-api-access-6qgxn\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.406613 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.406625 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2c87a8-8e65-4763-9ae8-1507026f0904-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.481090 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 08:20:38 crc kubenswrapper[4789]: W1216 08:20:38.481214 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6465ba6_5f85_4b46_baea_61349bea2e86.slice/crio-90fa201ace39bf0e8255f479d8bb34a1671bb37eab994a10646dac961568e1b7 WatchSource:0}: Error finding container 90fa201ace39bf0e8255f479d8bb34a1671bb37eab994a10646dac961568e1b7: Status 404 returned error can't find the container with id 90fa201ace39bf0e8255f479d8bb34a1671bb37eab994a10646dac961568e1b7 Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.755834 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.779134 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.787959 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:20:38 crc kubenswrapper[4789]: E1216 08:20:38.788565 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2c87a8-8e65-4763-9ae8-1507026f0904" containerName="nova-cell1-conductor-conductor" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.788581 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2c87a8-8e65-4763-9ae8-1507026f0904" containerName="nova-cell1-conductor-conductor" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.788894 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2c87a8-8e65-4763-9ae8-1507026f0904" containerName="nova-cell1-conductor-conductor" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.789810 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.794317 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.795673 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.909600 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.946480 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.946763 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:38 crc kubenswrapper[4789]: I1216 08:20:38.946887 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8bvp\" (UniqueName: \"kubernetes.io/projected/e68d94f6-4320-41cf-a86c-6ad140e2773a-kube-api-access-p8bvp\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.047643 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-combined-ca-bundle\") pod \"7096ea39-3a7b-4eca-b47b-953462331ae8\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.047780 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6c9c\" (UniqueName: \"kubernetes.io/projected/7096ea39-3a7b-4eca-b47b-953462331ae8-kube-api-access-t6c9c\") pod \"7096ea39-3a7b-4eca-b47b-953462331ae8\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " Dec 16 08:20:39 
crc kubenswrapper[4789]: I1216 08:20:39.047820 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-config-data\") pod \"7096ea39-3a7b-4eca-b47b-953462331ae8\" (UID: \"7096ea39-3a7b-4eca-b47b-953462331ae8\") " Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.048140 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.048171 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.048232 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8bvp\" (UniqueName: \"kubernetes.io/projected/e68d94f6-4320-41cf-a86c-6ad140e2773a-kube-api-access-p8bvp\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.052510 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7096ea39-3a7b-4eca-b47b-953462331ae8-kube-api-access-t6c9c" (OuterVolumeSpecName: "kube-api-access-t6c9c") pod "7096ea39-3a7b-4eca-b47b-953462331ae8" (UID: "7096ea39-3a7b-4eca-b47b-953462331ae8"). InnerVolumeSpecName "kube-api-access-t6c9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.053034 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.053155 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.086244 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8bvp\" (UniqueName: \"kubernetes.io/projected/e68d94f6-4320-41cf-a86c-6ad140e2773a-kube-api-access-p8bvp\") pod \"nova-cell1-conductor-0\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.096553 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.099314 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7096ea39-3a7b-4eca-b47b-953462331ae8" (UID: "7096ea39-3a7b-4eca-b47b-953462331ae8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.140547 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-config-data" (OuterVolumeSpecName: "config-data") pod "7096ea39-3a7b-4eca-b47b-953462331ae8" (UID: "7096ea39-3a7b-4eca-b47b-953462331ae8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.150155 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.150181 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6c9c\" (UniqueName: \"kubernetes.io/projected/7096ea39-3a7b-4eca-b47b-953462331ae8-kube-api-access-t6c9c\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.150192 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7096ea39-3a7b-4eca-b47b-953462331ae8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.249001 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.251532 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4zm7\" (UniqueName: \"kubernetes.io/projected/ec472b51-08c1-499e-8b85-e103741b35d8-kube-api-access-l4zm7\") pod \"ec472b51-08c1-499e-8b85-e103741b35d8\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.251612 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-combined-ca-bundle\") pod \"ec472b51-08c1-499e-8b85-e103741b35d8\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.251714 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-config-data\") pod \"ec472b51-08c1-499e-8b85-e103741b35d8\" (UID: \"ec472b51-08c1-499e-8b85-e103741b35d8\") " Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.259085 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec472b51-08c1-499e-8b85-e103741b35d8-kube-api-access-l4zm7" (OuterVolumeSpecName: "kube-api-access-l4zm7") pod "ec472b51-08c1-499e-8b85-e103741b35d8" (UID: "ec472b51-08c1-499e-8b85-e103741b35d8"). InnerVolumeSpecName "kube-api-access-l4zm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.285117 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec472b51-08c1-499e-8b85-e103741b35d8" (UID: "ec472b51-08c1-499e-8b85-e103741b35d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.287825 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-config-data" (OuterVolumeSpecName: "config-data") pod "ec472b51-08c1-499e-8b85-e103741b35d8" (UID: "ec472b51-08c1-499e-8b85-e103741b35d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.290686 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83d2253f-62e3-4a40-a1ff-66802515e914","Type":"ContainerStarted","Data":"1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.290730 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83d2253f-62e3-4a40-a1ff-66802515e914","Type":"ContainerStarted","Data":"4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.290739 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83d2253f-62e3-4a40-a1ff-66802515e914","Type":"ContainerStarted","Data":"0ec106db53e5ece5d29ef80eef15879e737dcb015db2784c9e67cada7a677743"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.296680 4789 generic.go:334] "Generic (PLEG): container finished" podID="7096ea39-3a7b-4eca-b47b-953462331ae8" containerID="90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" exitCode=0 Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.296766 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7096ea39-3a7b-4eca-b47b-953462331ae8","Type":"ContainerDied","Data":"90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 
08:20:39.296791 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7096ea39-3a7b-4eca-b47b-953462331ae8","Type":"ContainerDied","Data":"1635cfa2a17948cca46761ea36545dcfe8226ee92aa1272b7df7ceb8872dd220"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.296807 4789 scope.go:117] "RemoveContainer" containerID="90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.296942 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.299531 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6465ba6-5f85-4b46-baea-61349bea2e86","Type":"ContainerStarted","Data":"86c3b03baaccb18ae314a191d2fa45e7d1371b9bc7babc5165b6d3913214685c"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.299579 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6465ba6-5f85-4b46-baea-61349bea2e86","Type":"ContainerStarted","Data":"a4a1670d1a87a6d2ce3aaf61446e37348c93feb11559768fe701c88f1758b7f6"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.299595 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6465ba6-5f85-4b46-baea-61349bea2e86","Type":"ContainerStarted","Data":"90fa201ace39bf0e8255f479d8bb34a1671bb37eab994a10646dac961568e1b7"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.302654 4789 generic.go:334] "Generic (PLEG): container finished" podID="ec472b51-08c1-499e-8b85-e103741b35d8" containerID="dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" exitCode=0 Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.302691 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"ec472b51-08c1-499e-8b85-e103741b35d8","Type":"ContainerDied","Data":"dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.302712 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ec472b51-08c1-499e-8b85-e103741b35d8","Type":"ContainerDied","Data":"45b55c2c3a485926d1efa79d1a89907c26ad2c0dcf8849e883300145d34d947a"} Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.302732 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.315771 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.315749269 podStartE2EDuration="2.315749269s" podCreationTimestamp="2025-12-16 08:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:39.312296836 +0000 UTC m=+5377.574184465" watchObservedRunningTime="2025-12-16 08:20:39.315749269 +0000 UTC m=+5377.577636898" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.324177 4789 scope.go:117] "RemoveContainer" containerID="90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" Dec 16 08:20:39 crc kubenswrapper[4789]: E1216 08:20:39.324562 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792\": container with ID starting with 90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792 not found: ID does not exist" containerID="90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.324602 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792"} err="failed to get container status \"90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792\": rpc error: code = NotFound desc = could not find container \"90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792\": container with ID starting with 90ff1da5af9497f435c7c24c07544988dd54fb50b6833adfdbe6cf35d8fc5792 not found: ID does not exist" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.324620 4789 scope.go:117] "RemoveContainer" containerID="dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.359771 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.359802 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4zm7\" (UniqueName: \"kubernetes.io/projected/ec472b51-08c1-499e-8b85-e103741b35d8-kube-api-access-l4zm7\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.359813 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec472b51-08c1-499e-8b85-e103741b35d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.365865 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.365846334 podStartE2EDuration="2.365846334s" podCreationTimestamp="2025-12-16 08:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:39.358433893 +0000 UTC m=+5377.620321512" watchObservedRunningTime="2025-12-16 08:20:39.365846334 +0000 UTC 
m=+5377.627733963" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.389509 4789 scope.go:117] "RemoveContainer" containerID="dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" Dec 16 08:20:39 crc kubenswrapper[4789]: E1216 08:20:39.394510 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22\": container with ID starting with dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22 not found: ID does not exist" containerID="dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.394591 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22"} err="failed to get container status \"dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22\": rpc error: code = NotFound desc = could not find container \"dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22\": container with ID starting with dcc51713366398bfda1845c7bc56f0cf0a85638523922a76129c734cc21ace22 not found: ID does not exist" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.404945 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.413417 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.425060 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: E1216 08:20:39.425537 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec472b51-08c1-499e-8b85-e103741b35d8" containerName="nova-cell0-conductor-conductor" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 
08:20:39.425562 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec472b51-08c1-499e-8b85-e103741b35d8" containerName="nova-cell0-conductor-conductor" Dec 16 08:20:39 crc kubenswrapper[4789]: E1216 08:20:39.425578 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7096ea39-3a7b-4eca-b47b-953462331ae8" containerName="nova-scheduler-scheduler" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.425585 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7096ea39-3a7b-4eca-b47b-953462331ae8" containerName="nova-scheduler-scheduler" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.425759 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec472b51-08c1-499e-8b85-e103741b35d8" containerName="nova-cell0-conductor-conductor" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.425789 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7096ea39-3a7b-4eca-b47b-953462331ae8" containerName="nova-scheduler-scheduler" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.426373 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.429056 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.444953 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.466548 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.522625 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.545640 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.546788 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.557972 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.558535 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.561962 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.562088 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.562112 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fgdq\" (UniqueName: \"kubernetes.io/projected/70802626-e689-4f46-b3e4-5c2b74cec5bb-kube-api-access-2fgdq\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.666025 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.666081 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fgdq\" (UniqueName: \"kubernetes.io/projected/70802626-e689-4f46-b3e4-5c2b74cec5bb-kube-api-access-2fgdq\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.666152 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.666202 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.666262 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fq67\" (UniqueName: \"kubernetes.io/projected/a38e49bf-c8bd-4581-81d8-04c735d9e281-kube-api-access-2fq67\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.666292 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-config-data\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.670994 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.671147 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.692683 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fgdq\" (UniqueName: \"kubernetes.io/projected/70802626-e689-4f46-b3e4-5c2b74cec5bb-kube-api-access-2fgdq\") pod \"nova-cell0-conductor-0\" (UID: 
\"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.765585 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.769483 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fq67\" (UniqueName: \"kubernetes.io/projected/a38e49bf-c8bd-4581-81d8-04c735d9e281-kube-api-access-2fq67\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.769621 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-config-data\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.769816 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.777820 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.786867 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-config-data\") pod \"nova-scheduler-0\" (UID: 
\"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.789422 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fq67\" (UniqueName: \"kubernetes.io/projected/a38e49bf-c8bd-4581-81d8-04c735d9e281-kube-api-access-2fq67\") pod \"nova-scheduler-0\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.873513 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.885418 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 08:20:39 crc kubenswrapper[4789]: I1216 08:20:39.891086 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:39 crc kubenswrapper[4789]: W1216 08:20:39.921266 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode68d94f6_4320_41cf_a86c_6ad140e2773a.slice/crio-a80c342571c17d50a7506be7b38903b1a1b318c5020a6d22d9529894994fd7a0 WatchSource:0}: Error finding container a80c342571c17d50a7506be7b38903b1a1b318c5020a6d22d9529894994fd7a0: Status 404 returned error can't find the container with id a80c342571c17d50a7506be7b38903b1a1b318c5020a6d22d9529894994fd7a0 Dec 16 08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.124657 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2c87a8-8e65-4763-9ae8-1507026f0904" path="/var/lib/kubelet/pods/3a2c87a8-8e65-4763-9ae8-1507026f0904/volumes" Dec 16 08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.125964 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7096ea39-3a7b-4eca-b47b-953462331ae8" path="/var/lib/kubelet/pods/7096ea39-3a7b-4eca-b47b-953462331ae8/volumes" Dec 16 
08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.126620 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec472b51-08c1-499e-8b85-e103741b35d8" path="/var/lib/kubelet/pods/ec472b51-08c1-499e-8b85-e103741b35d8/volumes" Dec 16 08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.313550 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e68d94f6-4320-41cf-a86c-6ad140e2773a","Type":"ContainerStarted","Data":"d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44"} Dec 16 08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.314233 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e68d94f6-4320-41cf-a86c-6ad140e2773a","Type":"ContainerStarted","Data":"a80c342571c17d50a7506be7b38903b1a1b318c5020a6d22d9529894994fd7a0"} Dec 16 08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.354965 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.354942774 podStartE2EDuration="2.354942774s" podCreationTimestamp="2025-12-16 08:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:40.33928317 +0000 UTC m=+5378.601170809" watchObservedRunningTime="2025-12-16 08:20:40.354942774 +0000 UTC m=+5378.616830403" Dec 16 08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.466579 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 08:20:40 crc kubenswrapper[4789]: W1216 08:20:40.472997 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70802626_e689_4f46_b3e4_5c2b74cec5bb.slice/crio-e97bfa9565512888c35589a2484263e9c8adc7ad3ef33fe72c1d709e676171e8 WatchSource:0}: Error finding container e97bfa9565512888c35589a2484263e9c8adc7ad3ef33fe72c1d709e676171e8: Status 
404 returned error can't find the container with id e97bfa9565512888c35589a2484263e9c8adc7ad3ef33fe72c1d709e676171e8 Dec 16 08:20:40 crc kubenswrapper[4789]: I1216 08:20:40.477462 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 08:20:40 crc kubenswrapper[4789]: W1216 08:20:40.478139 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda38e49bf_c8bd_4581_81d8_04c735d9e281.slice/crio-52c4399ccd9797a0497da8e4f70bc65e0d6d91e69ead1f774ca29d7de9756588 WatchSource:0}: Error finding container 52c4399ccd9797a0497da8e4f70bc65e0d6d91e69ead1f774ca29d7de9756588: Status 404 returned error can't find the container with id 52c4399ccd9797a0497da8e4f70bc65e0d6d91e69ead1f774ca29d7de9756588 Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.322110 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a38e49bf-c8bd-4581-81d8-04c735d9e281","Type":"ContainerStarted","Data":"2b85906693eee84dbe4f22bc3650497471e1fcfa4651ef649bf3a8b85b3831a4"} Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.322405 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a38e49bf-c8bd-4581-81d8-04c735d9e281","Type":"ContainerStarted","Data":"52c4399ccd9797a0497da8e4f70bc65e0d6d91e69ead1f774ca29d7de9756588"} Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.324758 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"70802626-e689-4f46-b3e4-5c2b74cec5bb","Type":"ContainerStarted","Data":"bee44f846086f1f8459648f0221132f8bd82d20a15e603a91dd34127e94c38f4"} Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.324807 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.324823 4789 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"70802626-e689-4f46-b3e4-5c2b74cec5bb","Type":"ContainerStarted","Data":"e97bfa9565512888c35589a2484263e9c8adc7ad3ef33fe72c1d709e676171e8"} Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.324835 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.347896 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.3478780759999998 podStartE2EDuration="2.347878076s" podCreationTimestamp="2025-12-16 08:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:41.343295504 +0000 UTC m=+5379.605183133" watchObservedRunningTime="2025-12-16 08:20:41.347878076 +0000 UTC m=+5379.609765705" Dec 16 08:20:41 crc kubenswrapper[4789]: I1216 08:20:41.364794 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.364773819 podStartE2EDuration="2.364773819s" podCreationTimestamp="2025-12-16 08:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:20:41.360138396 +0000 UTC m=+5379.622026025" watchObservedRunningTime="2025-12-16 08:20:41.364773819 +0000 UTC m=+5379.626661438" Dec 16 08:20:42 crc kubenswrapper[4789]: I1216 08:20:42.695665 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:20:42 crc kubenswrapper[4789]: I1216 08:20:42.696064 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 08:20:44 crc kubenswrapper[4789]: I1216 08:20:44.873668 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Dec 16 08:20:44 crc kubenswrapper[4789]: I1216 08:20:44.891141 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:44 crc kubenswrapper[4789]: I1216 08:20:44.982544 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:45 crc kubenswrapper[4789]: I1216 08:20:45.363602 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 16 08:20:47 crc kubenswrapper[4789]: I1216 08:20:47.105091 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:20:47 crc kubenswrapper[4789]: E1216 08:20:47.105697 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:20:47 crc kubenswrapper[4789]: I1216 08:20:47.643162 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 08:20:47 crc kubenswrapper[4789]: I1216 08:20:47.643208 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 08:20:47 crc kubenswrapper[4789]: I1216 08:20:47.696644 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 08:20:47 crc kubenswrapper[4789]: I1216 08:20:47.696699 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 08:20:48 crc kubenswrapper[4789]: I1216 08:20:48.726129 4789 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:20:48 crc kubenswrapper[4789]: I1216 08:20:48.726196 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:20:48 crc kubenswrapper[4789]: I1216 08:20:48.808178 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:20:48 crc kubenswrapper[4789]: I1216 08:20:48.808262 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:20:49 crc kubenswrapper[4789]: I1216 08:20:49.282443 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 08:20:49 crc kubenswrapper[4789]: I1216 08:20:49.799316 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 16 08:20:49 crc kubenswrapper[4789]: I1216 08:20:49.874596 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 08:20:49 crc kubenswrapper[4789]: I1216 08:20:49.904408 4789 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 08:20:50 crc kubenswrapper[4789]: I1216 08:20:50.428599 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 08:20:52 crc kubenswrapper[4789]: I1216 08:20:52.871587 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:20:52 crc kubenswrapper[4789]: I1216 08:20:52.873880 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:20:52 crc kubenswrapper[4789]: I1216 08:20:52.875748 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 08:20:52 crc kubenswrapper[4789]: I1216 08:20:52.890088 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.014861 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588ebbe0-8963-4229-9b9d-2a0fcfab3300-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.015283 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.015348 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9m9w\" (UniqueName: \"kubernetes.io/projected/588ebbe0-8963-4229-9b9d-2a0fcfab3300-kube-api-access-x9m9w\") pod 
\"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.015373 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-scripts\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.015421 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.015458 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.117273 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588ebbe0-8963-4229-9b9d-2a0fcfab3300-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.117585 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 
16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.117712 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9m9w\" (UniqueName: \"kubernetes.io/projected/588ebbe0-8963-4229-9b9d-2a0fcfab3300-kube-api-access-x9m9w\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.117799 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-scripts\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.117381 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588ebbe0-8963-4229-9b9d-2a0fcfab3300-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.118028 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.118152 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.124476 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.124625 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-scripts\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.125850 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.126316 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.136248 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9m9w\" (UniqueName: \"kubernetes.io/projected/588ebbe0-8963-4229-9b9d-2a0fcfab3300-kube-api-access-x9m9w\") pod \"cinder-scheduler-0\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.194777 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.679766 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:20:53 crc kubenswrapper[4789]: W1216 08:20:53.680892 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod588ebbe0_8963_4229_9b9d_2a0fcfab3300.slice/crio-c8c570a67c399544a8436bb548775cb8dfb461254fc7329d68bb3001070e8e71 WatchSource:0}: Error finding container c8c570a67c399544a8436bb548775cb8dfb461254fc7329d68bb3001070e8e71: Status 404 returned error can't find the container with id c8c570a67c399544a8436bb548775cb8dfb461254fc7329d68bb3001070e8e71 Dec 16 08:20:53 crc kubenswrapper[4789]: I1216 08:20:53.683317 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.120979 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.123475 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api-log" containerID="cri-o://7119954074edc2147b9f80c3e275ef3c65846c611f8d24020542151fdc24dfed" gracePeriod=30 Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.127247 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api" containerID="cri-o://45f8f9ae2e88c319d111ccd07788b7082f88c961f1415e62d8815ec42ac556f0" gracePeriod=30 Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.438803 4789 generic.go:334] "Generic (PLEG): container finished" podID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerID="7119954074edc2147b9f80c3e275ef3c65846c611f8d24020542151fdc24dfed" 
exitCode=143 Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.438865 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"302f90e8-3193-47ca-ada3-03a19e0d8f32","Type":"ContainerDied","Data":"7119954074edc2147b9f80c3e275ef3c65846c611f8d24020542151fdc24dfed"} Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.441619 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"588ebbe0-8963-4229-9b9d-2a0fcfab3300","Type":"ContainerStarted","Data":"c8c570a67c399544a8436bb548775cb8dfb461254fc7329d68bb3001070e8e71"} Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.535070 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.537557 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.543455 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.559896 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.667404 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.667977 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-run\") pod \"cinder-volume-volume1-0\" (UID: 
\"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668009 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668037 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668057 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b26fb36c-42c2-4316-bab9-af89a7e7df12-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668089 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668151 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " 
pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668192 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668392 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668700 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668770 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtrd\" (UniqueName: \"kubernetes.io/projected/b26fb36c-42c2-4316-bab9-af89a7e7df12-kube-api-access-9qtrd\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " 
pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668823 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668850 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668903 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.668985 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771206 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 
08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771275 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-run\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771298 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771330 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771346 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b26fb36c-42c2-4316-bab9-af89a7e7df12-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771368 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771413 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771441 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771465 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771486 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771513 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771557 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtrd\" (UniqueName: \"kubernetes.io/projected/b26fb36c-42c2-4316-bab9-af89a7e7df12-kube-api-access-9qtrd\") pod \"cinder-volume-volume1-0\" (UID: 
\"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771589 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771608 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771640 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771678 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.771993 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 
08:20:54.772415 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.772551 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.772684 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-dev\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.773110 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-sys\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.773195 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-run\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.773256 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-locks-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.773805 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.773874 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.776096 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b26fb36c-42c2-4316-bab9-af89a7e7df12-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.777109 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b26fb36c-42c2-4316-bab9-af89a7e7df12-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.778280 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: 
I1216 08:20:54.787502 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.787505 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.790009 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26fb36c-42c2-4316-bab9-af89a7e7df12-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.793227 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtrd\" (UniqueName: \"kubernetes.io/projected/b26fb36c-42c2-4316-bab9-af89a7e7df12-kube-api-access-9qtrd\") pod \"cinder-volume-volume1-0\" (UID: \"b26fb36c-42c2-4316-bab9-af89a7e7df12\") " pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:54 crc kubenswrapper[4789]: I1216 08:20:54.875193 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.416938 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.421450 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.426603 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.448561 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.450983 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"588ebbe0-8963-4229-9b9d-2a0fcfab3300","Type":"ContainerStarted","Data":"98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640"} Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.594470 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.594514 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.594541 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-lib-modules\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.594559 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-config-data\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.594821 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-dev\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.594966 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595131 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595322 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-scripts\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595407 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-run\") pod \"cinder-backup-0\" (UID: 
\"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595449 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595598 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f480915-7f85-4e43-a3b6-63303a284b70-ceph\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595675 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-sys\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595730 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.595952 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 
08:20:55.596095 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rz7\" (UniqueName: \"kubernetes.io/projected/5f480915-7f85-4e43-a3b6-63303a284b70-kube-api-access-89rz7\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.596310 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.649520 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698485 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-sys\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698543 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698762 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc 
kubenswrapper[4789]: I1216 08:20:55.698783 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89rz7\" (UniqueName: \"kubernetes.io/projected/5f480915-7f85-4e43-a3b6-63303a284b70-kube-api-access-89rz7\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698793 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-sys\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698822 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698843 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698862 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698970 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699022 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-lib-modules\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698986 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698905 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-lib-modules\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.698970 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699069 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: 
I1216 08:20:55.699092 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-config-data\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699141 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-dev\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699165 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699195 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699208 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-dev\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699238 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-scripts\") pod \"cinder-backup-0\" (UID: 
\"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699269 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-run\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699279 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699317 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699390 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-run\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699434 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5f480915-7f85-4e43-a3b6-63303a284b70-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.699443 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/5f480915-7f85-4e43-a3b6-63303a284b70-ceph\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.712050 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-config-data\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.716676 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.717131 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f480915-7f85-4e43-a3b6-63303a284b70-ceph\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.717887 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.718528 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f480915-7f85-4e43-a3b6-63303a284b70-scripts\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.736804 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rz7\" (UniqueName: \"kubernetes.io/projected/5f480915-7f85-4e43-a3b6-63303a284b70-kube-api-access-89rz7\") pod \"cinder-backup-0\" (UID: \"5f480915-7f85-4e43-a3b6-63303a284b70\") " pod="openstack/cinder-backup-0" Dec 16 08:20:55 crc kubenswrapper[4789]: I1216 08:20:55.766322 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 16 08:20:56 crc kubenswrapper[4789]: I1216 08:20:56.385987 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 16 08:20:56 crc kubenswrapper[4789]: I1216 08:20:56.464800 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5f480915-7f85-4e43-a3b6-63303a284b70","Type":"ContainerStarted","Data":"21884f1851010dd894fea3d610c199dbcaaab93b6c535d7c287a93c50182e6b4"} Dec 16 08:20:56 crc kubenswrapper[4789]: I1216 08:20:56.468150 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"588ebbe0-8963-4229-9b9d-2a0fcfab3300","Type":"ContainerStarted","Data":"bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701"} Dec 16 08:20:56 crc kubenswrapper[4789]: I1216 08:20:56.472764 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b26fb36c-42c2-4316-bab9-af89a7e7df12","Type":"ContainerStarted","Data":"b8c04ec45d8a9fa3a415b0941b44fd19eda7ae408f2ab30c345b9343607b7d06"} Dec 16 08:20:56 crc kubenswrapper[4789]: I1216 08:20:56.472812 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b26fb36c-42c2-4316-bab9-af89a7e7df12","Type":"ContainerStarted","Data":"9ec31f29ccc054ca13941763d211b6e8b4cc5238f6cf66ae9e455bc96533c601"} Dec 16 08:20:56 crc kubenswrapper[4789]: I1216 08:20:56.506565 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=4.233897351 podStartE2EDuration="4.506539965s" podCreationTimestamp="2025-12-16 08:20:52 +0000 UTC" firstStartedPulling="2025-12-16 08:20:53.683027104 +0000 UTC m=+5391.944914733" lastFinishedPulling="2025-12-16 08:20:53.955669718 +0000 UTC m=+5392.217557347" observedRunningTime="2025-12-16 08:20:56.493348683 +0000 UTC m=+5394.755236332" watchObservedRunningTime="2025-12-16 08:20:56.506539965 +0000 UTC m=+5394.768427594" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.315158 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.79:8776/healthcheck\": read tcp 10.217.0.2:60388->10.217.1.79:8776: read: connection reset by peer" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.490136 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5f480915-7f85-4e43-a3b6-63303a284b70","Type":"ContainerStarted","Data":"e896a8659393c601dedf6c977d390a8610ea0af0dbd744f68ec34b0482af9620"} Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.490197 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5f480915-7f85-4e43-a3b6-63303a284b70","Type":"ContainerStarted","Data":"567aa565642db69dbf4835035e783549f12f12a5d45c9bbdc12a2866952b6594"} Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.493518 4789 generic.go:334] "Generic (PLEG): container finished" podID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerID="45f8f9ae2e88c319d111ccd07788b7082f88c961f1415e62d8815ec42ac556f0" exitCode=0 Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.493685 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"302f90e8-3193-47ca-ada3-03a19e0d8f32","Type":"ContainerDied","Data":"45f8f9ae2e88c319d111ccd07788b7082f88c961f1415e62d8815ec42ac556f0"} Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.498293 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"b26fb36c-42c2-4316-bab9-af89a7e7df12","Type":"ContainerStarted","Data":"3f201f36dc70ee419e415de42627ec4a7687ec155b0a942d6633d4f0e131ae01"} Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.539395 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.035107485 podStartE2EDuration="2.539372913s" podCreationTimestamp="2025-12-16 08:20:55 +0000 UTC" firstStartedPulling="2025-12-16 08:20:56.382022191 +0000 UTC m=+5394.643909830" lastFinishedPulling="2025-12-16 08:20:56.886287629 +0000 UTC m=+5395.148175258" observedRunningTime="2025-12-16 08:20:57.511500822 +0000 UTC m=+5395.773388451" watchObservedRunningTime="2025-12-16 08:20:57.539372913 +0000 UTC m=+5395.801260542" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.560936 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.177722451 podStartE2EDuration="3.560902009s" podCreationTimestamp="2025-12-16 08:20:54 +0000 UTC" firstStartedPulling="2025-12-16 08:20:55.649027203 +0000 UTC m=+5393.910914832" lastFinishedPulling="2025-12-16 08:20:56.032206761 +0000 UTC m=+5394.294094390" observedRunningTime="2025-12-16 08:20:57.543461193 +0000 UTC m=+5395.805348822" watchObservedRunningTime="2025-12-16 08:20:57.560902009 +0000 UTC m=+5395.822789638" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.654732 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.655259 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.656880 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.666612 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.704424 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.707332 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.714402 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.739551 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.876203 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data\") pod \"302f90e8-3193-47ca-ada3-03a19e0d8f32\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.876410 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/302f90e8-3193-47ca-ada3-03a19e0d8f32-etc-machine-id\") pod \"302f90e8-3193-47ca-ada3-03a19e0d8f32\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.876458 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwb4t\" (UniqueName: 
\"kubernetes.io/projected/302f90e8-3193-47ca-ada3-03a19e0d8f32-kube-api-access-xwb4t\") pod \"302f90e8-3193-47ca-ada3-03a19e0d8f32\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.876504 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302f90e8-3193-47ca-ada3-03a19e0d8f32-logs\") pod \"302f90e8-3193-47ca-ada3-03a19e0d8f32\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.876581 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-scripts\") pod \"302f90e8-3193-47ca-ada3-03a19e0d8f32\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.876612 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-combined-ca-bundle\") pod \"302f90e8-3193-47ca-ada3-03a19e0d8f32\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.876666 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data-custom\") pod \"302f90e8-3193-47ca-ada3-03a19e0d8f32\" (UID: \"302f90e8-3193-47ca-ada3-03a19e0d8f32\") " Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.878043 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302f90e8-3193-47ca-ada3-03a19e0d8f32-logs" (OuterVolumeSpecName: "logs") pod "302f90e8-3193-47ca-ada3-03a19e0d8f32" (UID: "302f90e8-3193-47ca-ada3-03a19e0d8f32"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.878104 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/302f90e8-3193-47ca-ada3-03a19e0d8f32-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "302f90e8-3193-47ca-ada3-03a19e0d8f32" (UID: "302f90e8-3193-47ca-ada3-03a19e0d8f32"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.891091 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "302f90e8-3193-47ca-ada3-03a19e0d8f32" (UID: "302f90e8-3193-47ca-ada3-03a19e0d8f32"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.892257 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/302f90e8-3193-47ca-ada3-03a19e0d8f32-kube-api-access-xwb4t" (OuterVolumeSpecName: "kube-api-access-xwb4t") pod "302f90e8-3193-47ca-ada3-03a19e0d8f32" (UID: "302f90e8-3193-47ca-ada3-03a19e0d8f32"). InnerVolumeSpecName "kube-api-access-xwb4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.892886 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-scripts" (OuterVolumeSpecName: "scripts") pod "302f90e8-3193-47ca-ada3-03a19e0d8f32" (UID: "302f90e8-3193-47ca-ada3-03a19e0d8f32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.928348 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "302f90e8-3193-47ca-ada3-03a19e0d8f32" (UID: "302f90e8-3193-47ca-ada3-03a19e0d8f32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.953254 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data" (OuterVolumeSpecName: "config-data") pod "302f90e8-3193-47ca-ada3-03a19e0d8f32" (UID: "302f90e8-3193-47ca-ada3-03a19e0d8f32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.978525 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.978558 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/302f90e8-3193-47ca-ada3-03a19e0d8f32-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.978571 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwb4t\" (UniqueName: \"kubernetes.io/projected/302f90e8-3193-47ca-ada3-03a19e0d8f32-kube-api-access-xwb4t\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.978581 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302f90e8-3193-47ca-ada3-03a19e0d8f32-logs\") on node \"crc\" DevicePath \"\"" Dec 16 
08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.978588 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.978596 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:57 crc kubenswrapper[4789]: I1216 08:20:57.978605 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/302f90e8-3193-47ca-ada3-03a19e0d8f32-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.195857 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.516812 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.519628 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"302f90e8-3193-47ca-ada3-03a19e0d8f32","Type":"ContainerDied","Data":"9353c0029bb8388061a7388debfb0be0aead88ac59730e7574087aa33b4c4599"} Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.520047 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.520152 4789 scope.go:117] "RemoveContainer" containerID="45f8f9ae2e88c319d111ccd07788b7082f88c961f1415e62d8815ec42ac556f0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.528195 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.528857 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.584305 4789 scope.go:117] "RemoveContainer" containerID="7119954074edc2147b9f80c3e275ef3c65846c611f8d24020542151fdc24dfed" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.637029 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.659408 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.670896 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:58 crc kubenswrapper[4789]: E1216 08:20:58.671424 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api-log" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.671451 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api-log" Dec 16 08:20:58 crc kubenswrapper[4789]: E1216 08:20:58.671490 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.671499 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.671725 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.671750 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" containerName="cinder-api-log" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.673088 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.681356 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.703529 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bb6971-b904-45a9-92a2-fda570c52dcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.703605 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: 
I1216 08:20:58.703659 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bb6971-b904-45a9-92a2-fda570c52dcd-logs\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.703731 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.703767 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-scripts\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.704029 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-config-data\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.704139 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65drk\" (UniqueName: \"kubernetes.io/projected/04bb6971-b904-45a9-92a2-fda570c52dcd-kube-api-access-65drk\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.710162 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:58 crc 
kubenswrapper[4789]: I1216 08:20:58.806686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bb6971-b904-45a9-92a2-fda570c52dcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.806774 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.806826 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bb6971-b904-45a9-92a2-fda570c52dcd-logs\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.806858 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.806921 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-scripts\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.806975 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-config-data\") pod 
\"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.807007 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65drk\" (UniqueName: \"kubernetes.io/projected/04bb6971-b904-45a9-92a2-fda570c52dcd-kube-api-access-65drk\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.807156 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bb6971-b904-45a9-92a2-fda570c52dcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.807752 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bb6971-b904-45a9-92a2-fda570c52dcd-logs\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.824048 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.824129 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-config-data\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.826962 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.832201 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bb6971-b904-45a9-92a2-fda570c52dcd-scripts\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.837129 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65drk\" (UniqueName: \"kubernetes.io/projected/04bb6971-b904-45a9-92a2-fda570c52dcd-kube-api-access-65drk\") pod \"cinder-api-0\" (UID: \"04bb6971-b904-45a9-92a2-fda570c52dcd\") " pod="openstack/cinder-api-0" Dec 16 08:20:58 crc kubenswrapper[4789]: I1216 08:20:58.999548 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 08:20:59 crc kubenswrapper[4789]: I1216 08:20:59.106792 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:20:59 crc kubenswrapper[4789]: E1216 08:20:59.107084 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:20:59 crc kubenswrapper[4789]: W1216 08:20:59.576204 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04bb6971_b904_45a9_92a2_fda570c52dcd.slice/crio-50236f2b828bb93fecaee6f6c9beeea035f49f0d0109629425ba606fcace2b8f WatchSource:0}: Error finding container 50236f2b828bb93fecaee6f6c9beeea035f49f0d0109629425ba606fcace2b8f: Status 404 returned error can't find the container with id 50236f2b828bb93fecaee6f6c9beeea035f49f0d0109629425ba606fcace2b8f Dec 16 08:20:59 crc kubenswrapper[4789]: I1216 08:20:59.582245 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 08:20:59 crc kubenswrapper[4789]: I1216 08:20:59.876146 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 16 08:21:00 crc kubenswrapper[4789]: I1216 08:21:00.117788 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302f90e8-3193-47ca-ada3-03a19e0d8f32" path="/var/lib/kubelet/pods/302f90e8-3193-47ca-ada3-03a19e0d8f32/volumes" Dec 16 08:21:00 crc kubenswrapper[4789]: I1216 08:21:00.536243 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"04bb6971-b904-45a9-92a2-fda570c52dcd","Type":"ContainerStarted","Data":"b878588296c20fea694b5736c7bd64a334d5b5d08e6e6d0ab76296ff783d642a"} Dec 16 08:21:00 crc kubenswrapper[4789]: I1216 08:21:00.536612 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"04bb6971-b904-45a9-92a2-fda570c52dcd","Type":"ContainerStarted","Data":"50236f2b828bb93fecaee6f6c9beeea035f49f0d0109629425ba606fcace2b8f"} Dec 16 08:21:00 crc kubenswrapper[4789]: I1216 08:21:00.767078 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 16 08:21:01 crc kubenswrapper[4789]: I1216 08:21:01.559854 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"04bb6971-b904-45a9-92a2-fda570c52dcd","Type":"ContainerStarted","Data":"0a955f119d6cd324370b42c26fb77aa8ca670d4d41cf9157df1fd1b914b27d4f"} Dec 16 08:21:01 crc kubenswrapper[4789]: I1216 08:21:01.560180 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 08:21:03 crc kubenswrapper[4789]: I1216 08:21:03.407141 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 08:21:03 crc kubenswrapper[4789]: I1216 08:21:03.429387 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.429350046 podStartE2EDuration="5.429350046s" podCreationTimestamp="2025-12-16 08:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:21:01.590691729 +0000 UTC m=+5399.852579358" watchObservedRunningTime="2025-12-16 08:21:03.429350046 +0000 UTC m=+5401.691237675" Dec 16 08:21:03 crc kubenswrapper[4789]: I1216 08:21:03.471410 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:21:03 crc 
kubenswrapper[4789]: I1216 08:21:03.582136 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="cinder-scheduler" containerID="cri-o://98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640" gracePeriod=30 Dec 16 08:21:03 crc kubenswrapper[4789]: I1216 08:21:03.582219 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="probe" containerID="cri-o://bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701" gracePeriod=30 Dec 16 08:21:04 crc kubenswrapper[4789]: I1216 08:21:04.592942 4789 generic.go:334] "Generic (PLEG): container finished" podID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerID="bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701" exitCode=0 Dec 16 08:21:04 crc kubenswrapper[4789]: I1216 08:21:04.592944 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"588ebbe0-8963-4229-9b9d-2a0fcfab3300","Type":"ContainerDied","Data":"bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701"} Dec 16 08:21:05 crc kubenswrapper[4789]: I1216 08:21:05.088719 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 16 08:21:05 crc kubenswrapper[4789]: I1216 08:21:05.963315 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.368937 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.545530 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data\") pod \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.545614 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9m9w\" (UniqueName: \"kubernetes.io/projected/588ebbe0-8963-4229-9b9d-2a0fcfab3300-kube-api-access-x9m9w\") pod \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.545729 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588ebbe0-8963-4229-9b9d-2a0fcfab3300-etc-machine-id\") pod \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.545750 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-combined-ca-bundle\") pod \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.545774 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-scripts\") pod \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.545809 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/588ebbe0-8963-4229-9b9d-2a0fcfab3300-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "588ebbe0-8963-4229-9b9d-2a0fcfab3300" (UID: "588ebbe0-8963-4229-9b9d-2a0fcfab3300"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.545864 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data-custom\") pod \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\" (UID: \"588ebbe0-8963-4229-9b9d-2a0fcfab3300\") " Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.546233 4789 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/588ebbe0-8963-4229-9b9d-2a0fcfab3300-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.552547 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588ebbe0-8963-4229-9b9d-2a0fcfab3300-kube-api-access-x9m9w" (OuterVolumeSpecName: "kube-api-access-x9m9w") pod "588ebbe0-8963-4229-9b9d-2a0fcfab3300" (UID: "588ebbe0-8963-4229-9b9d-2a0fcfab3300"). InnerVolumeSpecName "kube-api-access-x9m9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.563105 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-scripts" (OuterVolumeSpecName: "scripts") pod "588ebbe0-8963-4229-9b9d-2a0fcfab3300" (UID: "588ebbe0-8963-4229-9b9d-2a0fcfab3300"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.563271 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "588ebbe0-8963-4229-9b9d-2a0fcfab3300" (UID: "588ebbe0-8963-4229-9b9d-2a0fcfab3300"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.604850 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "588ebbe0-8963-4229-9b9d-2a0fcfab3300" (UID: "588ebbe0-8963-4229-9b9d-2a0fcfab3300"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.612184 4789 generic.go:334] "Generic (PLEG): container finished" podID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerID="98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640" exitCode=0 Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.612233 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"588ebbe0-8963-4229-9b9d-2a0fcfab3300","Type":"ContainerDied","Data":"98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640"} Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.612264 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"588ebbe0-8963-4229-9b9d-2a0fcfab3300","Type":"ContainerDied","Data":"c8c570a67c399544a8436bb548775cb8dfb461254fc7329d68bb3001070e8e71"} Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.612284 4789 scope.go:117] "RemoveContainer" containerID="bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701" Dec 16 
08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.612455 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.646665 4789 scope.go:117] "RemoveContainer" containerID="98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.647901 4789 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.647996 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9m9w\" (UniqueName: \"kubernetes.io/projected/588ebbe0-8963-4229-9b9d-2a0fcfab3300-kube-api-access-x9m9w\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.648008 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.648016 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.660111 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data" (OuterVolumeSpecName: "config-data") pod "588ebbe0-8963-4229-9b9d-2a0fcfab3300" (UID: "588ebbe0-8963-4229-9b9d-2a0fcfab3300"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.670129 4789 scope.go:117] "RemoveContainer" containerID="bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701" Dec 16 08:21:06 crc kubenswrapper[4789]: E1216 08:21:06.670695 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701\": container with ID starting with bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701 not found: ID does not exist" containerID="bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.670777 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701"} err="failed to get container status \"bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701\": rpc error: code = NotFound desc = could not find container \"bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701\": container with ID starting with bcfaddd78aa48219694daeb3de9742d802ac146fc778a7578c5a3cb9d8598701 not found: ID does not exist" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.670803 4789 scope.go:117] "RemoveContainer" containerID="98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640" Dec 16 08:21:06 crc kubenswrapper[4789]: E1216 08:21:06.671320 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640\": container with ID starting with 98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640 not found: ID does not exist" containerID="98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.671346 
4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640"} err="failed to get container status \"98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640\": rpc error: code = NotFound desc = could not find container \"98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640\": container with ID starting with 98bba63fe76f718d479314ca48fb968ac57ef97a0d6da5cdc4fe2fb4a79d0640 not found: ID does not exist" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.749854 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588ebbe0-8963-4229-9b9d-2a0fcfab3300-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.944048 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.951471 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.969762 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:21:06 crc kubenswrapper[4789]: E1216 08:21:06.970369 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="cinder-scheduler" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.970388 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="cinder-scheduler" Dec 16 08:21:06 crc kubenswrapper[4789]: E1216 08:21:06.970407 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="probe" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.970415 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="probe" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.970643 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="probe" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.970678 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" containerName="cinder-scheduler" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.971892 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.974433 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 08:21:06 crc kubenswrapper[4789]: I1216 08:21:06.982637 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.156334 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.156429 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-scripts\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.156516 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7n49\" (UniqueName: 
\"kubernetes.io/projected/f1ad6601-c17e-4847-b540-8bc8d8997934-kube-api-access-g7n49\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.156556 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.156575 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-config-data\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.156874 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1ad6601-c17e-4847-b540-8bc8d8997934-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.258611 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.258864 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.259082 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1ad6601-c17e-4847-b540-8bc8d8997934-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.259231 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.259376 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-scripts\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.259521 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7n49\" (UniqueName: \"kubernetes.io/projected/f1ad6601-c17e-4847-b540-8bc8d8997934-kube-api-access-g7n49\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.259140 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1ad6601-c17e-4847-b540-8bc8d8997934-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 
08:21:07.262051 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.262644 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.263414 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-config-data\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.276420 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1ad6601-c17e-4847-b540-8bc8d8997934-scripts\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.277022 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7n49\" (UniqueName: \"kubernetes.io/projected/f1ad6601-c17e-4847-b540-8bc8d8997934-kube-api-access-g7n49\") pod \"cinder-scheduler-0\" (UID: \"f1ad6601-c17e-4847-b540-8bc8d8997934\") " pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.292141 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 08:21:07 crc kubenswrapper[4789]: I1216 08:21:07.710950 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 08:21:07 crc kubenswrapper[4789]: W1216 08:21:07.711384 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1ad6601_c17e_4847_b540_8bc8d8997934.slice/crio-1ecfd8a10412f5b77dc012fbeeb53c160d9fe7c3e555c8b4bc32a7afbbc47624 WatchSource:0}: Error finding container 1ecfd8a10412f5b77dc012fbeeb53c160d9fe7c3e555c8b4bc32a7afbbc47624: Status 404 returned error can't find the container with id 1ecfd8a10412f5b77dc012fbeeb53c160d9fe7c3e555c8b4bc32a7afbbc47624 Dec 16 08:21:08 crc kubenswrapper[4789]: I1216 08:21:08.117216 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588ebbe0-8963-4229-9b9d-2a0fcfab3300" path="/var/lib/kubelet/pods/588ebbe0-8963-4229-9b9d-2a0fcfab3300/volumes" Dec 16 08:21:08 crc kubenswrapper[4789]: I1216 08:21:08.644378 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f1ad6601-c17e-4847-b540-8bc8d8997934","Type":"ContainerStarted","Data":"5eb63b7df96f447697dfb7d12c4c67228284e75c1880eb4888882a656e033e46"} Dec 16 08:21:08 crc kubenswrapper[4789]: I1216 08:21:08.644432 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f1ad6601-c17e-4847-b540-8bc8d8997934","Type":"ContainerStarted","Data":"1ecfd8a10412f5b77dc012fbeeb53c160d9fe7c3e555c8b4bc32a7afbbc47624"} Dec 16 08:21:09 crc kubenswrapper[4789]: I1216 08:21:09.654700 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f1ad6601-c17e-4847-b540-8bc8d8997934","Type":"ContainerStarted","Data":"9f5f033d287525e13642660eccd462ff4ee633f1d2064b1b4dc7e56b22e7c168"} Dec 16 08:21:09 crc kubenswrapper[4789]: I1216 08:21:09.672283 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.672193675 podStartE2EDuration="3.672193675s" podCreationTimestamp="2025-12-16 08:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:21:09.671249952 +0000 UTC m=+5407.933137601" watchObservedRunningTime="2025-12-16 08:21:09.672193675 +0000 UTC m=+5407.934081304" Dec 16 08:21:10 crc kubenswrapper[4789]: I1216 08:21:10.802508 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 08:21:11 crc kubenswrapper[4789]: I1216 08:21:11.105560 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:21:11 crc kubenswrapper[4789]: E1216 08:21:11.105855 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:21:12 crc kubenswrapper[4789]: I1216 08:21:12.301215 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.168386 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ch8sj"] Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.170562 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.183108 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch8sj"] Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.331412 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46xf\" (UniqueName: \"kubernetes.io/projected/806e978e-5828-438e-af87-9ee66f148ebc-kube-api-access-r46xf\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.331481 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-catalog-content\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.331627 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-utilities\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.433224 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-utilities\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.433303 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r46xf\" (UniqueName: \"kubernetes.io/projected/806e978e-5828-438e-af87-9ee66f148ebc-kube-api-access-r46xf\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.433331 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-catalog-content\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.433807 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-catalog-content\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.434063 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-utilities\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.456327 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r46xf\" (UniqueName: \"kubernetes.io/projected/806e978e-5828-438e-af87-9ee66f148ebc-kube-api-access-r46xf\") pod \"community-operators-ch8sj\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:15 crc kubenswrapper[4789]: I1216 08:21:15.492986 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:16 crc kubenswrapper[4789]: W1216 08:21:16.034445 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod806e978e_5828_438e_af87_9ee66f148ebc.slice/crio-7145cf8f4e23bf47d8c6c4dfc5b831b45f411d9c07b7e4c8999d385af24851f1 WatchSource:0}: Error finding container 7145cf8f4e23bf47d8c6c4dfc5b831b45f411d9c07b7e4c8999d385af24851f1: Status 404 returned error can't find the container with id 7145cf8f4e23bf47d8c6c4dfc5b831b45f411d9c07b7e4c8999d385af24851f1 Dec 16 08:21:16 crc kubenswrapper[4789]: I1216 08:21:16.035032 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ch8sj"] Dec 16 08:21:16 crc kubenswrapper[4789]: I1216 08:21:16.719724 4789 generic.go:334] "Generic (PLEG): container finished" podID="806e978e-5828-438e-af87-9ee66f148ebc" containerID="76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f" exitCode=0 Dec 16 08:21:16 crc kubenswrapper[4789]: I1216 08:21:16.719832 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch8sj" event={"ID":"806e978e-5828-438e-af87-9ee66f148ebc","Type":"ContainerDied","Data":"76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f"} Dec 16 08:21:16 crc kubenswrapper[4789]: I1216 08:21:16.720090 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch8sj" event={"ID":"806e978e-5828-438e-af87-9ee66f148ebc","Type":"ContainerStarted","Data":"7145cf8f4e23bf47d8c6c4dfc5b831b45f411d9c07b7e4c8999d385af24851f1"} Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.491865 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.595653 4789 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hqlp9"] Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.598127 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.612170 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlp9"] Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.689510 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-catalog-content\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.689580 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dpp\" (UniqueName: \"kubernetes.io/projected/83824d14-2e2e-42a7-a3d8-dc25657a761b-kube-api-access-75dpp\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.689664 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-utilities\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.793047 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-catalog-content\") pod \"redhat-marketplace-hqlp9\" (UID: 
\"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.793133 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75dpp\" (UniqueName: \"kubernetes.io/projected/83824d14-2e2e-42a7-a3d8-dc25657a761b-kube-api-access-75dpp\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.793227 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-utilities\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.793657 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-catalog-content\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.800071 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-utilities\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.815708 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dpp\" (UniqueName: \"kubernetes.io/projected/83824d14-2e2e-42a7-a3d8-dc25657a761b-kube-api-access-75dpp\") pod \"redhat-marketplace-hqlp9\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " 
pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:17 crc kubenswrapper[4789]: I1216 08:21:17.926565 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.180373 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5vkf"] Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.183002 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.209772 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5vkf"] Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.303049 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/ef91c187-dca6-41ad-9846-57e753d84328-kube-api-access-64f2j\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.303166 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-utilities\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.303208 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-catalog-content\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " 
pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.383965 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlp9"] Dec 16 08:21:18 crc kubenswrapper[4789]: W1216 08:21:18.393580 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83824d14_2e2e_42a7_a3d8_dc25657a761b.slice/crio-2f079069d127dc50c0ea47a7ab6937df90bdc488e0c252abf6907e3243d05e94 WatchSource:0}: Error finding container 2f079069d127dc50c0ea47a7ab6937df90bdc488e0c252abf6907e3243d05e94: Status 404 returned error can't find the container with id 2f079069d127dc50c0ea47a7ab6937df90bdc488e0c252abf6907e3243d05e94 Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.405160 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-utilities\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.405206 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-catalog-content\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.405303 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/ef91c187-dca6-41ad-9846-57e753d84328-kube-api-access-64f2j\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.406094 
4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-utilities\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.410125 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-catalog-content\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.425745 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/ef91c187-dca6-41ad-9846-57e753d84328-kube-api-access-64f2j\") pod \"redhat-operators-t5vkf\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.505787 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.748723 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch8sj" event={"ID":"806e978e-5828-438e-af87-9ee66f148ebc","Type":"ContainerStarted","Data":"36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114"} Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.753814 4789 generic.go:334] "Generic (PLEG): container finished" podID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerID="b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823" exitCode=0 Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.753846 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlp9" event={"ID":"83824d14-2e2e-42a7-a3d8-dc25657a761b","Type":"ContainerDied","Data":"b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823"} Dec 16 08:21:18 crc kubenswrapper[4789]: I1216 08:21:18.753883 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlp9" event={"ID":"83824d14-2e2e-42a7-a3d8-dc25657a761b","Type":"ContainerStarted","Data":"2f079069d127dc50c0ea47a7ab6937df90bdc488e0c252abf6907e3243d05e94"} Dec 16 08:21:19 crc kubenswrapper[4789]: I1216 08:21:19.003566 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5vkf"] Dec 16 08:21:19 crc kubenswrapper[4789]: I1216 08:21:19.764426 4789 generic.go:334] "Generic (PLEG): container finished" podID="ef91c187-dca6-41ad-9846-57e753d84328" containerID="919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0" exitCode=0 Dec 16 08:21:19 crc kubenswrapper[4789]: I1216 08:21:19.764544 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5vkf" 
event={"ID":"ef91c187-dca6-41ad-9846-57e753d84328","Type":"ContainerDied","Data":"919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0"} Dec 16 08:21:19 crc kubenswrapper[4789]: I1216 08:21:19.765136 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5vkf" event={"ID":"ef91c187-dca6-41ad-9846-57e753d84328","Type":"ContainerStarted","Data":"f28ffd5cc5cbad43eeee548ae1f5f4f227ac986d2c4c0640816c2fe6f33b9ec0"} Dec 16 08:21:19 crc kubenswrapper[4789]: I1216 08:21:19.774747 4789 generic.go:334] "Generic (PLEG): container finished" podID="806e978e-5828-438e-af87-9ee66f148ebc" containerID="36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114" exitCode=0 Dec 16 08:21:19 crc kubenswrapper[4789]: I1216 08:21:19.774799 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch8sj" event={"ID":"806e978e-5828-438e-af87-9ee66f148ebc","Type":"ContainerDied","Data":"36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114"} Dec 16 08:21:20 crc kubenswrapper[4789]: I1216 08:21:20.786017 4789 generic.go:334] "Generic (PLEG): container finished" podID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerID="6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2" exitCode=0 Dec 16 08:21:20 crc kubenswrapper[4789]: I1216 08:21:20.786283 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlp9" event={"ID":"83824d14-2e2e-42a7-a3d8-dc25657a761b","Type":"ContainerDied","Data":"6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2"} Dec 16 08:21:21 crc kubenswrapper[4789]: I1216 08:21:21.808778 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch8sj" event={"ID":"806e978e-5828-438e-af87-9ee66f148ebc","Type":"ContainerStarted","Data":"b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47"} Dec 16 08:21:21 crc kubenswrapper[4789]: I1216 
08:21:21.812107 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5vkf" event={"ID":"ef91c187-dca6-41ad-9846-57e753d84328","Type":"ContainerStarted","Data":"16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2"} Dec 16 08:21:21 crc kubenswrapper[4789]: I1216 08:21:21.833081 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ch8sj" podStartSLOduration=2.778424324 podStartE2EDuration="6.833065142s" podCreationTimestamp="2025-12-16 08:21:15 +0000 UTC" firstStartedPulling="2025-12-16 08:21:16.722352369 +0000 UTC m=+5414.984239998" lastFinishedPulling="2025-12-16 08:21:20.776993187 +0000 UTC m=+5419.038880816" observedRunningTime="2025-12-16 08:21:21.826650525 +0000 UTC m=+5420.088538154" watchObservedRunningTime="2025-12-16 08:21:21.833065142 +0000 UTC m=+5420.094952771" Dec 16 08:21:22 crc kubenswrapper[4789]: I1216 08:21:22.821052 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlp9" event={"ID":"83824d14-2e2e-42a7-a3d8-dc25657a761b","Type":"ContainerStarted","Data":"8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd"} Dec 16 08:21:22 crc kubenswrapper[4789]: I1216 08:21:22.843421 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hqlp9" podStartSLOduration=2.767017877 podStartE2EDuration="5.84340372s" podCreationTimestamp="2025-12-16 08:21:17 +0000 UTC" firstStartedPulling="2025-12-16 08:21:18.755656994 +0000 UTC m=+5417.017544623" lastFinishedPulling="2025-12-16 08:21:21.832042837 +0000 UTC m=+5420.093930466" observedRunningTime="2025-12-16 08:21:22.837706981 +0000 UTC m=+5421.099594610" watchObservedRunningTime="2025-12-16 08:21:22.84340372 +0000 UTC m=+5421.105291349" Dec 16 08:21:23 crc kubenswrapper[4789]: I1216 08:21:23.105510 4789 scope.go:117] "RemoveContainer" 
containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:21:23 crc kubenswrapper[4789]: E1216 08:21:23.106118 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:21:24 crc kubenswrapper[4789]: I1216 08:21:24.841566 4789 generic.go:334] "Generic (PLEG): container finished" podID="ef91c187-dca6-41ad-9846-57e753d84328" containerID="16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2" exitCode=0 Dec 16 08:21:24 crc kubenswrapper[4789]: I1216 08:21:24.841636 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5vkf" event={"ID":"ef91c187-dca6-41ad-9846-57e753d84328","Type":"ContainerDied","Data":"16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2"} Dec 16 08:21:25 crc kubenswrapper[4789]: I1216 08:21:25.494761 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:25 crc kubenswrapper[4789]: I1216 08:21:25.495165 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:26 crc kubenswrapper[4789]: I1216 08:21:26.541723 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ch8sj" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="registry-server" probeResult="failure" output=< Dec 16 08:21:26 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 08:21:26 crc kubenswrapper[4789]: > Dec 16 08:21:27 crc kubenswrapper[4789]: I1216 
08:21:27.873715 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5vkf" event={"ID":"ef91c187-dca6-41ad-9846-57e753d84328","Type":"ContainerStarted","Data":"de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4"} Dec 16 08:21:27 crc kubenswrapper[4789]: I1216 08:21:27.898579 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t5vkf" podStartSLOduration=2.5328632989999997 podStartE2EDuration="9.898560046s" podCreationTimestamp="2025-12-16 08:21:18 +0000 UTC" firstStartedPulling="2025-12-16 08:21:19.766206597 +0000 UTC m=+5418.028094216" lastFinishedPulling="2025-12-16 08:21:27.131903334 +0000 UTC m=+5425.393790963" observedRunningTime="2025-12-16 08:21:27.889947615 +0000 UTC m=+5426.151835244" watchObservedRunningTime="2025-12-16 08:21:27.898560046 +0000 UTC m=+5426.160447675" Dec 16 08:21:27 crc kubenswrapper[4789]: I1216 08:21:27.927148 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:27 crc kubenswrapper[4789]: I1216 08:21:27.927204 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:27 crc kubenswrapper[4789]: I1216 08:21:27.975421 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:28 crc kubenswrapper[4789]: I1216 08:21:28.506467 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:28 crc kubenswrapper[4789]: I1216 08:21:28.506520 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:28 crc kubenswrapper[4789]: I1216 08:21:28.924469 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:29 crc kubenswrapper[4789]: I1216 08:21:29.047280 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fffh6"] Dec 16 08:21:29 crc kubenswrapper[4789]: I1216 08:21:29.055873 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fffh6"] Dec 16 08:21:29 crc kubenswrapper[4789]: I1216 08:21:29.547677 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t5vkf" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="registry-server" probeResult="failure" output=< Dec 16 08:21:29 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 08:21:29 crc kubenswrapper[4789]: > Dec 16 08:21:29 crc kubenswrapper[4789]: I1216 08:21:29.561157 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlp9"] Dec 16 08:21:30 crc kubenswrapper[4789]: I1216 08:21:30.041947 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-709f-account-create-update-vw46j"] Dec 16 08:21:30 crc kubenswrapper[4789]: I1216 08:21:30.050100 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-709f-account-create-update-vw46j"] Dec 16 08:21:30 crc kubenswrapper[4789]: I1216 08:21:30.114294 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201ce99d-6f07-4de4-a84e-ce221215a532" path="/var/lib/kubelet/pods/201ce99d-6f07-4de4-a84e-ce221215a532/volumes" Dec 16 08:21:30 crc kubenswrapper[4789]: I1216 08:21:30.114858 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad970bb-e71b-431c-ae77-d85133144832" path="/var/lib/kubelet/pods/aad970bb-e71b-431c-ae77-d85133144832/volumes" Dec 16 08:21:30 crc kubenswrapper[4789]: I1216 08:21:30.288198 4789 scope.go:117] "RemoveContainer" containerID="ad6dce7d1055a8a8e4d640acae27762ad6b5495e5b83fa99a838a7bc2eedc9bb" Dec 
16 08:21:30 crc kubenswrapper[4789]: I1216 08:21:30.313598 4789 scope.go:117] "RemoveContainer" containerID="7eea37e3283a63fbfa33adb2ac6bcc7bce7091e96b48f61a59c32234fc3413da" Dec 16 08:21:30 crc kubenswrapper[4789]: I1216 08:21:30.897741 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hqlp9" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="registry-server" containerID="cri-o://8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd" gracePeriod=2 Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.350926 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.548156 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75dpp\" (UniqueName: \"kubernetes.io/projected/83824d14-2e2e-42a7-a3d8-dc25657a761b-kube-api-access-75dpp\") pod \"83824d14-2e2e-42a7-a3d8-dc25657a761b\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.548334 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-catalog-content\") pod \"83824d14-2e2e-42a7-a3d8-dc25657a761b\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.548446 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-utilities\") pod \"83824d14-2e2e-42a7-a3d8-dc25657a761b\" (UID: \"83824d14-2e2e-42a7-a3d8-dc25657a761b\") " Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.549147 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-utilities" (OuterVolumeSpecName: "utilities") pod "83824d14-2e2e-42a7-a3d8-dc25657a761b" (UID: "83824d14-2e2e-42a7-a3d8-dc25657a761b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.554491 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83824d14-2e2e-42a7-a3d8-dc25657a761b-kube-api-access-75dpp" (OuterVolumeSpecName: "kube-api-access-75dpp") pod "83824d14-2e2e-42a7-a3d8-dc25657a761b" (UID: "83824d14-2e2e-42a7-a3d8-dc25657a761b"). InnerVolumeSpecName "kube-api-access-75dpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.570634 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83824d14-2e2e-42a7-a3d8-dc25657a761b" (UID: "83824d14-2e2e-42a7-a3d8-dc25657a761b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.650960 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.651025 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75dpp\" (UniqueName: \"kubernetes.io/projected/83824d14-2e2e-42a7-a3d8-dc25657a761b-kube-api-access-75dpp\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.651039 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83824d14-2e2e-42a7-a3d8-dc25657a761b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.908579 4789 generic.go:334] "Generic (PLEG): container finished" podID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerID="8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd" exitCode=0 Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.908823 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlp9" event={"ID":"83824d14-2e2e-42a7-a3d8-dc25657a761b","Type":"ContainerDied","Data":"8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd"} Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.908937 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqlp9" event={"ID":"83824d14-2e2e-42a7-a3d8-dc25657a761b","Type":"ContainerDied","Data":"2f079069d127dc50c0ea47a7ab6937df90bdc488e0c252abf6907e3243d05e94"} Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.909032 4789 scope.go:117] "RemoveContainer" containerID="8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 
08:21:31.909233 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqlp9" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.936144 4789 scope.go:117] "RemoveContainer" containerID="6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2" Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.950036 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlp9"] Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.954868 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqlp9"] Dec 16 08:21:31 crc kubenswrapper[4789]: I1216 08:21:31.975668 4789 scope.go:117] "RemoveContainer" containerID="b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823" Dec 16 08:21:32 crc kubenswrapper[4789]: I1216 08:21:32.015792 4789 scope.go:117] "RemoveContainer" containerID="8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd" Dec 16 08:21:32 crc kubenswrapper[4789]: E1216 08:21:32.016289 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd\": container with ID starting with 8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd not found: ID does not exist" containerID="8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd" Dec 16 08:21:32 crc kubenswrapper[4789]: I1216 08:21:32.016329 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd"} err="failed to get container status \"8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd\": rpc error: code = NotFound desc = could not find container \"8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd\": container with ID starting with 
8d8a137a43394ef2ad939f7601c6b5be53d72842278e8d23c67d5abdf89ac1fd not found: ID does not exist" Dec 16 08:21:32 crc kubenswrapper[4789]: I1216 08:21:32.016355 4789 scope.go:117] "RemoveContainer" containerID="6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2" Dec 16 08:21:32 crc kubenswrapper[4789]: E1216 08:21:32.016766 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2\": container with ID starting with 6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2 not found: ID does not exist" containerID="6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2" Dec 16 08:21:32 crc kubenswrapper[4789]: I1216 08:21:32.016799 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2"} err="failed to get container status \"6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2\": rpc error: code = NotFound desc = could not find container \"6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2\": container with ID starting with 6295463a80fa0e66c70aa3f4305bd21024db311861fc36ac3ee3bdeb3dbc0ce2 not found: ID does not exist" Dec 16 08:21:32 crc kubenswrapper[4789]: I1216 08:21:32.016839 4789 scope.go:117] "RemoveContainer" containerID="b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823" Dec 16 08:21:32 crc kubenswrapper[4789]: E1216 08:21:32.017196 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823\": container with ID starting with b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823 not found: ID does not exist" containerID="b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823" Dec 16 08:21:32 crc 
kubenswrapper[4789]: I1216 08:21:32.017221 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823"} err="failed to get container status \"b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823\": rpc error: code = NotFound desc = could not find container \"b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823\": container with ID starting with b48a580e0f602a8dcde360ae9562b6a99863ba4f0b12c5c205c2bc93d9ca8823 not found: ID does not exist" Dec 16 08:21:32 crc kubenswrapper[4789]: I1216 08:21:32.117105 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" path="/var/lib/kubelet/pods/83824d14-2e2e-42a7-a3d8-dc25657a761b/volumes" Dec 16 08:21:34 crc kubenswrapper[4789]: I1216 08:21:34.105368 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:21:34 crc kubenswrapper[4789]: E1216 08:21:34.106277 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:21:35 crc kubenswrapper[4789]: I1216 08:21:35.534586 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:35 crc kubenswrapper[4789]: I1216 08:21:35.579335 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:35 crc kubenswrapper[4789]: I1216 08:21:35.960531 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ch8sj"] Dec 16 08:21:36 crc kubenswrapper[4789]: I1216 08:21:36.969678 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ch8sj" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="registry-server" containerID="cri-o://b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47" gracePeriod=2 Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.438977 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.568435 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r46xf\" (UniqueName: \"kubernetes.io/projected/806e978e-5828-438e-af87-9ee66f148ebc-kube-api-access-r46xf\") pod \"806e978e-5828-438e-af87-9ee66f148ebc\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.568555 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-utilities\") pod \"806e978e-5828-438e-af87-9ee66f148ebc\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.568648 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-catalog-content\") pod \"806e978e-5828-438e-af87-9ee66f148ebc\" (UID: \"806e978e-5828-438e-af87-9ee66f148ebc\") " Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.569443 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-utilities" (OuterVolumeSpecName: "utilities") pod "806e978e-5828-438e-af87-9ee66f148ebc" (UID: 
"806e978e-5828-438e-af87-9ee66f148ebc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.576346 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806e978e-5828-438e-af87-9ee66f148ebc-kube-api-access-r46xf" (OuterVolumeSpecName: "kube-api-access-r46xf") pod "806e978e-5828-438e-af87-9ee66f148ebc" (UID: "806e978e-5828-438e-af87-9ee66f148ebc"). InnerVolumeSpecName "kube-api-access-r46xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.622833 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "806e978e-5828-438e-af87-9ee66f148ebc" (UID: "806e978e-5828-438e-af87-9ee66f148ebc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.670837 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r46xf\" (UniqueName: \"kubernetes.io/projected/806e978e-5828-438e-af87-9ee66f148ebc-kube-api-access-r46xf\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.670879 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.670891 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/806e978e-5828-438e-af87-9ee66f148ebc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.979200 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="806e978e-5828-438e-af87-9ee66f148ebc" containerID="b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47" exitCode=0 Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.979284 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ch8sj" Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.979301 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch8sj" event={"ID":"806e978e-5828-438e-af87-9ee66f148ebc","Type":"ContainerDied","Data":"b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47"} Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.980322 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ch8sj" event={"ID":"806e978e-5828-438e-af87-9ee66f148ebc","Type":"ContainerDied","Data":"7145cf8f4e23bf47d8c6c4dfc5b831b45f411d9c07b7e4c8999d385af24851f1"} Dec 16 08:21:37 crc kubenswrapper[4789]: I1216 08:21:37.980343 4789 scope.go:117] "RemoveContainer" containerID="b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.005235 4789 scope.go:117] "RemoveContainer" containerID="36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.015938 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ch8sj"] Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.025717 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ch8sj"] Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.029510 4789 scope.go:117] "RemoveContainer" containerID="76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.069099 4789 scope.go:117] "RemoveContainer" 
containerID="b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47" Dec 16 08:21:38 crc kubenswrapper[4789]: E1216 08:21:38.069554 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47\": container with ID starting with b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47 not found: ID does not exist" containerID="b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.069610 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47"} err="failed to get container status \"b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47\": rpc error: code = NotFound desc = could not find container \"b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47\": container with ID starting with b5e4ca5980382e15fb36b7439c8179c13534e40be4966fbea12bcf472a352f47 not found: ID does not exist" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.069632 4789 scope.go:117] "RemoveContainer" containerID="36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114" Dec 16 08:21:38 crc kubenswrapper[4789]: E1216 08:21:38.070207 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114\": container with ID starting with 36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114 not found: ID does not exist" containerID="36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.070259 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114"} err="failed to get container status \"36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114\": rpc error: code = NotFound desc = could not find container \"36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114\": container with ID starting with 36daa15cfe624a4d69fc8e63e6b619e81c9df0b54dddb9abf270e025b17a1114 not found: ID does not exist" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.070289 4789 scope.go:117] "RemoveContainer" containerID="76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f" Dec 16 08:21:38 crc kubenswrapper[4789]: E1216 08:21:38.070726 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f\": container with ID starting with 76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f not found: ID does not exist" containerID="76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.070812 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f"} err="failed to get container status \"76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f\": rpc error: code = NotFound desc = could not find container \"76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f\": container with ID starting with 76c9d58b2f0effdc9d3ef94fc72f1075d496462e22ad27f85aa620b56dda1e8f not found: ID does not exist" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.118199 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="806e978e-5828-438e-af87-9ee66f148ebc" path="/var/lib/kubelet/pods/806e978e-5828-438e-af87-9ee66f148ebc/volumes" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 
08:21:38.554270 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:38 crc kubenswrapper[4789]: I1216 08:21:38.599480 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.370343 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5vkf"] Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.370691 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t5vkf" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="registry-server" containerID="cri-o://de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4" gracePeriod=2 Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.781133 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.935882 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/ef91c187-dca6-41ad-9846-57e753d84328-kube-api-access-64f2j\") pod \"ef91c187-dca6-41ad-9846-57e753d84328\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.935972 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-catalog-content\") pod \"ef91c187-dca6-41ad-9846-57e753d84328\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.936107 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-utilities\") pod \"ef91c187-dca6-41ad-9846-57e753d84328\" (UID: \"ef91c187-dca6-41ad-9846-57e753d84328\") " Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.937416 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-utilities" (OuterVolumeSpecName: "utilities") pod "ef91c187-dca6-41ad-9846-57e753d84328" (UID: "ef91c187-dca6-41ad-9846-57e753d84328"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:21:40 crc kubenswrapper[4789]: I1216 08:21:40.941369 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef91c187-dca6-41ad-9846-57e753d84328-kube-api-access-64f2j" (OuterVolumeSpecName: "kube-api-access-64f2j") pod "ef91c187-dca6-41ad-9846-57e753d84328" (UID: "ef91c187-dca6-41ad-9846-57e753d84328"). InnerVolumeSpecName "kube-api-access-64f2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.039145 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.039415 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64f2j\" (UniqueName: \"kubernetes.io/projected/ef91c187-dca6-41ad-9846-57e753d84328-kube-api-access-64f2j\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.047626 4789 generic.go:334] "Generic (PLEG): container finished" podID="ef91c187-dca6-41ad-9846-57e753d84328" containerID="de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4" exitCode=0 Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.047664 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5vkf" event={"ID":"ef91c187-dca6-41ad-9846-57e753d84328","Type":"ContainerDied","Data":"de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4"} Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.047700 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5vkf" event={"ID":"ef91c187-dca6-41ad-9846-57e753d84328","Type":"ContainerDied","Data":"f28ffd5cc5cbad43eeee548ae1f5f4f227ac986d2c4c0640816c2fe6f33b9ec0"} Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.047718 4789 scope.go:117] "RemoveContainer" containerID="de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.047831 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5vkf" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.057024 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-k6fzp"] Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.063694 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef91c187-dca6-41ad-9846-57e753d84328" (UID: "ef91c187-dca6-41ad-9846-57e753d84328"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.067844 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-k6fzp"] Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.069589 4789 scope.go:117] "RemoveContainer" containerID="16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.086889 4789 scope.go:117] "RemoveContainer" containerID="919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.122210 4789 scope.go:117] "RemoveContainer" containerID="de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4" Dec 16 08:21:41 crc kubenswrapper[4789]: E1216 08:21:41.122692 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4\": container with ID starting with de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4 not found: ID does not exist" containerID="de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.122742 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4"} err="failed to get container status \"de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4\": rpc error: code = NotFound desc = could not find container \"de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4\": container with ID starting with de8fbf61bde5037aa31d460379fb0cc48182960c6ce7899051daf5d019fd5cd4 not found: ID does not exist" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.122768 4789 scope.go:117] "RemoveContainer" containerID="16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2" Dec 16 08:21:41 crc kubenswrapper[4789]: E1216 08:21:41.123316 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2\": container with ID starting with 16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2 not found: ID does not exist" containerID="16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.123350 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2"} err="failed to get container status \"16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2\": rpc error: code = NotFound desc = could not find container \"16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2\": container with ID starting with 16b4567cceebeff022fb34d46246297d5c08cb74d4a8affdb69574d8554b1cb2 not found: ID does not exist" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.123369 4789 scope.go:117] "RemoveContainer" containerID="919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0" Dec 16 08:21:41 crc kubenswrapper[4789]: E1216 08:21:41.123642 4789 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0\": container with ID starting with 919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0 not found: ID does not exist" containerID="919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.123670 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0"} err="failed to get container status \"919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0\": rpc error: code = NotFound desc = could not find container \"919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0\": container with ID starting with 919f0ed76de0dd1d6515b379c199ea846f48216ffda646a09e0310152a0263f0 not found: ID does not exist" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.142091 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef91c187-dca6-41ad-9846-57e753d84328-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.399710 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5vkf"] Dec 16 08:21:41 crc kubenswrapper[4789]: I1216 08:21:41.407568 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t5vkf"] Dec 16 08:21:42 crc kubenswrapper[4789]: I1216 08:21:42.116360 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e" path="/var/lib/kubelet/pods/cd4f4d7c-b258-4ad0-92eb-42b86ba4ce2e/volumes" Dec 16 08:21:42 crc kubenswrapper[4789]: I1216 08:21:42.117114 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef91c187-dca6-41ad-9846-57e753d84328" 
path="/var/lib/kubelet/pods/ef91c187-dca6-41ad-9846-57e753d84328/volumes" Dec 16 08:21:46 crc kubenswrapper[4789]: I1216 08:21:46.106036 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:21:46 crc kubenswrapper[4789]: E1216 08:21:46.106928 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:21:54 crc kubenswrapper[4789]: I1216 08:21:54.056521 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fnzlk"] Dec 16 08:21:54 crc kubenswrapper[4789]: I1216 08:21:54.067990 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fnzlk"] Dec 16 08:21:54 crc kubenswrapper[4789]: I1216 08:21:54.119279 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed5961a3-6529-4866-ba46-acc8e57c0dc1" path="/var/lib/kubelet/pods/ed5961a3-6529-4866-ba46-acc8e57c0dc1/volumes" Dec 16 08:21:57 crc kubenswrapper[4789]: I1216 08:21:57.105513 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:21:57 crc kubenswrapper[4789]: E1216 08:21:57.106085 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 
08:22:12 crc kubenswrapper[4789]: I1216 08:22:12.110825 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:22:12 crc kubenswrapper[4789]: E1216 08:22:12.112893 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:22:25 crc kubenswrapper[4789]: I1216 08:22:25.105493 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:22:25 crc kubenswrapper[4789]: E1216 08:22:25.106301 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:22:30 crc kubenswrapper[4789]: I1216 08:22:30.586158 4789 scope.go:117] "RemoveContainer" containerID="b45e8cdaac36c45fe8479f84d146a4c0bdc4e8d6add454265fd7f249755971f4" Dec 16 08:22:30 crc kubenswrapper[4789]: I1216 08:22:30.622244 4789 scope.go:117] "RemoveContainer" containerID="937f80c64f7219d4b576f08960d8e0c2b347ac2e606386ec32fa85857201ac27" Dec 16 08:22:40 crc kubenswrapper[4789]: I1216 08:22:40.104799 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:22:40 crc kubenswrapper[4789]: E1216 08:22:40.105659 4789 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.701888 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75c9998c75-cdgh8"] Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702781 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="extract-utilities" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702793 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="extract-utilities" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702811 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702816 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702826 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702832 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702844 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="extract-content" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702851 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="extract-content" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702863 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="extract-utilities" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702870 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="extract-utilities" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702884 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="extract-content" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702890 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="extract-content" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702900 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702905 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702934 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="extract-utilities" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702940 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="extract-utilities" Dec 16 08:22:47 crc kubenswrapper[4789]: E1216 08:22:47.702960 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="extract-content" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.702966 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="extract-content" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.703116 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="806e978e-5828-438e-af87-9ee66f148ebc" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.703159 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef91c187-dca6-41ad-9846-57e753d84328" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.703171 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="83824d14-2e2e-42a7-a3d8-dc25657a761b" containerName="registry-server" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.704115 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.706490 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-95mfn" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.706577 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.706492 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.706692 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.721100 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c9998c75-cdgh8"] Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.729317 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-scripts\") pod 
\"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.729361 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvnc\" (UniqueName: \"kubernetes.io/projected/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-kube-api-access-xgvnc\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.729444 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-config-data\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.729480 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-horizon-secret-key\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.729537 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-logs\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.761798 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.762059 4789 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-external-api-0" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-log" containerID="cri-o://68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588" gracePeriod=30 Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.762206 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-httpd" containerID="cri-o://25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f" gracePeriod=30 Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.831866 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-config-data\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.832327 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-horizon-secret-key\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.832404 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-logs\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.832472 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvnc\" (UniqueName: \"kubernetes.io/projected/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-kube-api-access-xgvnc\") pod 
\"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.832494 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-scripts\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.833210 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-scripts\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.833345 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-config-data\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.833405 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-logs\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.840454 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-horizon-secret-key\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 
08:22:47.843675 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.843882 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerName="glance-log" containerID="cri-o://67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae" gracePeriod=30 Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.844240 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerName="glance-httpd" containerID="cri-o://f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1" gracePeriod=30 Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.868146 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgvnc\" (UniqueName: \"kubernetes.io/projected/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-kube-api-access-xgvnc\") pod \"horizon-75c9998c75-cdgh8\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.906150 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5687b5b99f-jt7pv"] Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.907653 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:47 crc kubenswrapper[4789]: I1216 08:22:47.918249 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5687b5b99f-jt7pv"] Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.031212 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.041467 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwbsd\" (UniqueName: \"kubernetes.io/projected/3c494572-49a9-431b-88d5-ebaf4c93d86d-kube-api-access-bwbsd\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.041791 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-config-data\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.041891 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c494572-49a9-431b-88d5-ebaf4c93d86d-horizon-secret-key\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.042075 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-scripts\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.042236 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c494572-49a9-431b-88d5-ebaf4c93d86d-logs\") pod \"horizon-5687b5b99f-jt7pv\" (UID: 
\"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.145118 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-config-data\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.145413 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c494572-49a9-431b-88d5-ebaf4c93d86d-horizon-secret-key\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.145438 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-scripts\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.145484 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c494572-49a9-431b-88d5-ebaf4c93d86d-logs\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.145619 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwbsd\" (UniqueName: \"kubernetes.io/projected/3c494572-49a9-431b-88d5-ebaf4c93d86d-kube-api-access-bwbsd\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc 
kubenswrapper[4789]: I1216 08:22:48.146865 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-config-data\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.147238 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c494572-49a9-431b-88d5-ebaf4c93d86d-logs\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.147327 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-scripts\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.150852 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c494572-49a9-431b-88d5-ebaf4c93d86d-horizon-secret-key\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.164016 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwbsd\" (UniqueName: \"kubernetes.io/projected/3c494572-49a9-431b-88d5-ebaf4c93d86d-kube-api-access-bwbsd\") pod \"horizon-5687b5b99f-jt7pv\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.245599 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5687b5b99f-jt7pv"] Dec 16 08:22:48 
crc kubenswrapper[4789]: I1216 08:22:48.246447 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.279388 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f9d64d5cc-h75zb"] Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.283253 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.296414 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f9d64d5cc-h75zb"] Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.452939 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-logs\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.453325 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b44nf\" (UniqueName: \"kubernetes.io/projected/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-kube-api-access-b44nf\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.453404 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-config-data\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.453444 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-scripts\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.453468 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-horizon-secret-key\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.508555 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c9998c75-cdgh8"] Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.555113 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b44nf\" (UniqueName: \"kubernetes.io/projected/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-kube-api-access-b44nf\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.555227 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-config-data\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.555261 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-scripts\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.555290 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-horizon-secret-key\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.555323 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-logs\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.555716 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-logs\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.556127 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-scripts\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.556960 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-config-data\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.559284 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-horizon-secret-key\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.568487 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b44nf\" (UniqueName: \"kubernetes.io/projected/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-kube-api-access-b44nf\") pod \"horizon-f9d64d5cc-h75zb\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.614033 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.741677 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c9998c75-cdgh8" event={"ID":"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d","Type":"ContainerStarted","Data":"c551b38a130ac5f500930e4c9af38a4e2a994fdd69b23b031a5ca9caec648668"} Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.745366 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerID="68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588" exitCode=143 Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.745414 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea","Type":"ContainerDied","Data":"68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588"} Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.746948 4789 generic.go:334] "Generic (PLEG): container finished" podID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerID="67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae" exitCode=143 Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.746973 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3","Type":"ContainerDied","Data":"67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae"} Dec 16 08:22:48 crc kubenswrapper[4789]: I1216 08:22:48.797853 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5687b5b99f-jt7pv"] Dec 16 08:22:49 crc kubenswrapper[4789]: I1216 08:22:49.142636 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f9d64d5cc-h75zb"] Dec 16 08:22:49 crc kubenswrapper[4789]: W1216 08:22:49.151688 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode431d2d0_98ab_42f4_bb32_3fe3aca72c6b.slice/crio-350907d9d13ef348919167611f66208b0c9d2f2fa0da4d38b56d4a8717f89113 WatchSource:0}: Error finding container 350907d9d13ef348919167611f66208b0c9d2f2fa0da4d38b56d4a8717f89113: Status 404 returned error can't find the container with id 350907d9d13ef348919167611f66208b0c9d2f2fa0da4d38b56d4a8717f89113 Dec 16 08:22:49 crc kubenswrapper[4789]: I1216 08:22:49.756940 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d64d5cc-h75zb" event={"ID":"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b","Type":"ContainerStarted","Data":"350907d9d13ef348919167611f66208b0c9d2f2fa0da4d38b56d4a8717f89113"} Dec 16 08:22:49 crc kubenswrapper[4789]: I1216 08:22:49.759525 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5687b5b99f-jt7pv" event={"ID":"3c494572-49a9-431b-88d5-ebaf4c93d86d","Type":"ContainerStarted","Data":"d1a6e9bc4f735f1f58d4e654c681f402f870609d11190bd8bc4271ba531394bf"} Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.530472 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.621156 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-ceph\") pod \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.621245 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-combined-ca-bundle\") pod \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.621343 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8glzg\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-kube-api-access-8glzg\") pod \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.621419 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-logs\") pod \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.621468 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-config-data\") pod \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.621498 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-scripts\") pod \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.621554 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-httpd-run\") pod \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\" (UID: \"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.622620 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" (UID: "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.623038 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-logs" (OuterVolumeSpecName: "logs") pod "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" (UID: "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.629725 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-ceph" (OuterVolumeSpecName: "ceph") pod "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" (UID: "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.629842 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-kube-api-access-8glzg" (OuterVolumeSpecName: "kube-api-access-8glzg") pod "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" (UID: "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea"). InnerVolumeSpecName "kube-api-access-8glzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.631774 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-scripts" (OuterVolumeSpecName: "scripts") pod "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" (UID: "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.656624 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" (UID: "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.663443 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.673771 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-config-data" (OuterVolumeSpecName: "config-data") pod "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" (UID: "a0c581d5-1c7a-455b-a714-2d9cfc6a96ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.723273 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.723304 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.723314 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.723322 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.723332 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8glzg\" (UniqueName: \"kubernetes.io/projected/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-kube-api-access-8glzg\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.723340 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.723347 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.779206 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerID="f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1" exitCode=0 Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.779283 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3","Type":"ContainerDied","Data":"f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1"} Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.779317 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3","Type":"ContainerDied","Data":"001346f108f9b66d8b594c1123591540e02bc7bc39c9ef6658e42e7e97968220"} Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.779337 4789 scope.go:117] "RemoveContainer" containerID="f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.779498 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.784632 4789 generic.go:334] "Generic (PLEG): container finished" podID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerID="25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f" exitCode=0 Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.784826 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea","Type":"ContainerDied","Data":"25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f"} Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.784886 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0c581d5-1c7a-455b-a714-2d9cfc6a96ea","Type":"ContainerDied","Data":"5da7599cf296c984fb95ed5a86379a55e662eedc72901c6ed552ba2dc07bdc67"} Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.784943 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.821050 4789 scope.go:117] "RemoveContainer" containerID="67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.826031 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5p5f\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-kube-api-access-b5p5f\") pod \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.826140 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-combined-ca-bundle\") pod \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.826214 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-config-data\") pod \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.826308 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-logs\") pod \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.826340 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-scripts\") pod \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " Dec 16 
08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.826411 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-ceph\") pod \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.826463 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-httpd-run\") pod \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\" (UID: \"c74ae8fa-e4f5-4138-9b8c-356ff2345ba3\") " Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.828411 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" (UID: "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.828830 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-logs" (OuterVolumeSpecName: "logs") pod "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" (UID: "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.833809 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-kube-api-access-b5p5f" (OuterVolumeSpecName: "kube-api-access-b5p5f") pod "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" (UID: "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3"). InnerVolumeSpecName "kube-api-access-b5p5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.833883 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.834487 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-scripts" (OuterVolumeSpecName: "scripts") pod "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" (UID: "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.841825 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-ceph" (OuterVolumeSpecName: "ceph") pod "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" (UID: "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.853384 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.861134 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.861520 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-log" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.861539 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-log" Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.861576 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-httpd" Dec 16 08:22:51 crc 
kubenswrapper[4789]: I1216 08:22:51.861585 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-httpd" Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.861602 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerName="glance-httpd" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.861609 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerName="glance-httpd" Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.861618 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerName="glance-log" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.861624 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerName="glance-log" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.861656 4789 scope.go:117] "RemoveContainer" containerID="f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1" Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.863166 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1\": container with ID starting with f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1 not found: ID does not exist" containerID="f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.863197 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1"} err="failed to get container status \"f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1\": rpc error: code = NotFound desc = could not find container 
\"f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1\": container with ID starting with f9cac19a8850f553b2e201cf18d1491d676b1af032533c32e6ca6da3c117f4f1 not found: ID does not exist" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.863239 4789 scope.go:117] "RemoveContainer" containerID="67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae" Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.863584 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae\": container with ID starting with 67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae not found: ID does not exist" containerID="67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.863603 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae"} err="failed to get container status \"67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae\": rpc error: code = NotFound desc = could not find container \"67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae\": container with ID starting with 67d0034772d8c33162b94a6af7fb9b8d63323930b83ba74c6462109a1aad76ae not found: ID does not exist" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.863638 4789 scope.go:117] "RemoveContainer" containerID="25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.864099 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-log" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.864294 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" 
containerName="glance-log" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.864323 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" containerName="glance-httpd" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.864396 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" (UID: "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.864374 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" containerName="glance-httpd" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.866260 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.868874 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.879553 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-config-data" (OuterVolumeSpecName: "config-data") pod "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" (UID: "c74ae8fa-e4f5-4138-9b8c-356ff2345ba3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.892285 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.900662 4789 scope.go:117] "RemoveContainer" containerID="68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.930432 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.931511 4789 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.931535 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5p5f\" (UniqueName: \"kubernetes.io/projected/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-kube-api-access-b5p5f\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.931544 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.931555 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.931566 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc 
kubenswrapper[4789]: I1216 08:22:51.931573 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.933750 4789 scope.go:117] "RemoveContainer" containerID="25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f" Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.937717 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f\": container with ID starting with 25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f not found: ID does not exist" containerID="25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.937793 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f"} err="failed to get container status \"25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f\": rpc error: code = NotFound desc = could not find container \"25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f\": container with ID starting with 25dac1f7911ca84b15690e3bda227ca0d568ed6dc0cfa6aa4afed557e4f4b40f not found: ID does not exist" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.937824 4789 scope.go:117] "RemoveContainer" containerID="68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588" Dec 16 08:22:51 crc kubenswrapper[4789]: E1216 08:22:51.938358 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588\": container with ID starting with 
68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588 not found: ID does not exist" containerID="68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588" Dec 16 08:22:51 crc kubenswrapper[4789]: I1216 08:22:51.938395 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588"} err="failed to get container status \"68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588\": rpc error: code = NotFound desc = could not find container \"68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588\": container with ID starting with 68fd782283302a0d4c84212d724caebac870e76598a4ab17033aadd26091b588 not found: ID does not exist" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.033626 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-config-data\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.033691 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.033725 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-logs\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc 
kubenswrapper[4789]: I1216 08:22:52.033802 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6x64\" (UniqueName: \"kubernetes.io/projected/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-kube-api-access-q6x64\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.033826 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.033882 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-ceph\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.033934 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-scripts\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.108739 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:22:52 crc kubenswrapper[4789]: E1216 08:22:52.108971 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.138153 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-ceph\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.138227 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-scripts\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.138313 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-config-data\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.138422 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.138456 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-logs\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.138479 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6x64\" (UniqueName: \"kubernetes.io/projected/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-kube-api-access-q6x64\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.138511 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.139056 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.139725 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-logs\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.139790 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c581d5-1c7a-455b-a714-2d9cfc6a96ea" path="/var/lib/kubelet/pods/a0c581d5-1c7a-455b-a714-2d9cfc6a96ea/volumes" Dec 16 08:22:52 crc 
kubenswrapper[4789]: I1216 08:22:52.140702 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.144386 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-ceph\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.153854 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.154975 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.157759 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-config-data\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.172462 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-scripts\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.176383 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:22:52 crc 
kubenswrapper[4789]: I1216 08:22:52.181889 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6x64\" (UniqueName: \"kubernetes.io/projected/314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9-kube-api-access-q6x64\") pod \"glance-default-external-api-0\" (UID: \"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9\") " pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.183284 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.185988 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.193262 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.198617 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.348304 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6177-3aa1-44ed-bfa3-aa69902ad292-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.348569 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6177-3aa1-44ed-bfa3-aa69902ad292-logs\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.348740 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/979e6177-3aa1-44ed-bfa3-aa69902ad292-ceph\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.348904 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqsr\" (UniqueName: \"kubernetes.io/projected/979e6177-3aa1-44ed-bfa3-aa69902ad292-kube-api-access-8nqsr\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.349071 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.349163 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.349273 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-scripts\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.451130 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-scripts\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.451493 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6177-3aa1-44ed-bfa3-aa69902ad292-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.451553 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6177-3aa1-44ed-bfa3-aa69902ad292-logs\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.451622 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/979e6177-3aa1-44ed-bfa3-aa69902ad292-ceph\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.451701 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqsr\" (UniqueName: \"kubernetes.io/projected/979e6177-3aa1-44ed-bfa3-aa69902ad292-kube-api-access-8nqsr\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.452098 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-config-data\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.452043 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6177-3aa1-44ed-bfa3-aa69902ad292-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.452132 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc 
kubenswrapper[4789]: I1216 08:22:52.452181 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6177-3aa1-44ed-bfa3-aa69902ad292-logs\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.457014 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.457099 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-scripts\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.457334 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e6177-3aa1-44ed-bfa3-aa69902ad292-config-data\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.463997 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/979e6177-3aa1-44ed-bfa3-aa69902ad292-ceph\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.465602 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqsr\" 
(UniqueName: \"kubernetes.io/projected/979e6177-3aa1-44ed-bfa3-aa69902ad292-kube-api-access-8nqsr\") pod \"glance-default-internal-api-0\" (UID: \"979e6177-3aa1-44ed-bfa3-aa69902ad292\") " pod="openstack/glance-default-internal-api-0" Dec 16 08:22:52 crc kubenswrapper[4789]: I1216 08:22:52.607845 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 08:22:53 crc kubenswrapper[4789]: I1216 08:22:53.158857 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 08:22:53 crc kubenswrapper[4789]: I1216 08:22:53.566280 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 08:22:53 crc kubenswrapper[4789]: I1216 08:22:53.809048 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"979e6177-3aa1-44ed-bfa3-aa69902ad292","Type":"ContainerStarted","Data":"e6ff1ad1a0ab4c3a9a3eded74ce29705bf6255a934b104aefd9a7a7a9a6fb2e0"} Dec 16 08:22:53 crc kubenswrapper[4789]: I1216 08:22:53.809423 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"979e6177-3aa1-44ed-bfa3-aa69902ad292","Type":"ContainerStarted","Data":"4d57fc89d15133551d40a9d70721c89911f64c4cc9b779603c0430b0ab93c498"} Dec 16 08:22:54 crc kubenswrapper[4789]: I1216 08:22:54.117053 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74ae8fa-e4f5-4138-9b8c-356ff2345ba3" path="/var/lib/kubelet/pods/c74ae8fa-e4f5-4138-9b8c-356ff2345ba3/volumes" Dec 16 08:22:58 crc kubenswrapper[4789]: I1216 08:22:58.297556 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9","Type":"ContainerStarted","Data":"1949315c91deb9f079f4a36d4eb7f785d58fdacae2c5e9d1dadbc24d1fdd5326"} Dec 16 08:23:01 crc kubenswrapper[4789]: I1216 
08:23:01.325711 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9","Type":"ContainerStarted","Data":"8d4c9840f8ed0f4d5159a5f22f74006d3e2cc6666e55b18bef7dfa145d1896ef"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.343967 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9","Type":"ContainerStarted","Data":"3bcda739a81c7c2aded8abc7dbb08ab5f1442dbbc31667a195bca22245fd1aaf"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.354463 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5687b5b99f-jt7pv" event={"ID":"3c494572-49a9-431b-88d5-ebaf4c93d86d","Type":"ContainerStarted","Data":"57aa49623098cd36b57468cd006fdd074ee09e3ca9fc2c11d11c3a2091272cda"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.354512 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5687b5b99f-jt7pv" event={"ID":"3c494572-49a9-431b-88d5-ebaf4c93d86d","Type":"ContainerStarted","Data":"58fe57e00a7fcd23ee288d25d350f91defbb8713a09b1465190316df24a0d7c1"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.354646 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5687b5b99f-jt7pv" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon-log" containerID="cri-o://58fe57e00a7fcd23ee288d25d350f91defbb8713a09b1465190316df24a0d7c1" gracePeriod=30 Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.354999 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5687b5b99f-jt7pv" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon" containerID="cri-o://57aa49623098cd36b57468cd006fdd074ee09e3ca9fc2c11d11c3a2091272cda" gracePeriod=30 Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.362973 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"979e6177-3aa1-44ed-bfa3-aa69902ad292","Type":"ContainerStarted","Data":"c41dc239b94d5f5bf73b8b839809af741e8c241ece2b898a04ffcb075be098ec"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.377029 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c9998c75-cdgh8" event={"ID":"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d","Type":"ContainerStarted","Data":"34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.377074 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c9998c75-cdgh8" event={"ID":"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d","Type":"ContainerStarted","Data":"6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.379394 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.379373418 podStartE2EDuration="11.379373418s" podCreationTimestamp="2025-12-16 08:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:23:02.367094218 +0000 UTC m=+5520.628981847" watchObservedRunningTime="2025-12-16 08:23:02.379373418 +0000 UTC m=+5520.641261047" Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.380565 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d64d5cc-h75zb" event={"ID":"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b","Type":"ContainerStarted","Data":"11849b79e5ea201afcfdc834fc7d0612b50e09f3d28214ac641f44c75a5e094a"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.380594 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d64d5cc-h75zb" 
event={"ID":"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b","Type":"ContainerStarted","Data":"4ef17f0617de7ff1aa5a06845af58c14c66565e017c2f85f43b0d884753e03b4"} Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.409299 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5687b5b99f-jt7pv" podStartSLOduration=2.981068897 podStartE2EDuration="15.409281309s" podCreationTimestamp="2025-12-16 08:22:47 +0000 UTC" firstStartedPulling="2025-12-16 08:22:48.819133723 +0000 UTC m=+5507.081021352" lastFinishedPulling="2025-12-16 08:23:01.247346135 +0000 UTC m=+5519.509233764" observedRunningTime="2025-12-16 08:23:02.406307937 +0000 UTC m=+5520.668195576" watchObservedRunningTime="2025-12-16 08:23:02.409281309 +0000 UTC m=+5520.671168928" Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.433122 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.433104192 podStartE2EDuration="10.433104192s" podCreationTimestamp="2025-12-16 08:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:23:02.426465979 +0000 UTC m=+5520.688353618" watchObservedRunningTime="2025-12-16 08:23:02.433104192 +0000 UTC m=+5520.694991821" Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.447403 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f9d64d5cc-h75zb" podStartSLOduration=2.393204341 podStartE2EDuration="14.447384301s" podCreationTimestamp="2025-12-16 08:22:48 +0000 UTC" firstStartedPulling="2025-12-16 08:22:49.154404158 +0000 UTC m=+5507.416291787" lastFinishedPulling="2025-12-16 08:23:01.208584108 +0000 UTC m=+5519.470471747" observedRunningTime="2025-12-16 08:23:02.446339515 +0000 UTC m=+5520.708227144" watchObservedRunningTime="2025-12-16 08:23:02.447384301 +0000 UTC m=+5520.709271930" Dec 16 08:23:02 crc 
kubenswrapper[4789]: I1216 08:23:02.476118 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75c9998c75-cdgh8" podStartSLOduration=2.766492172 podStartE2EDuration="15.476096513s" podCreationTimestamp="2025-12-16 08:22:47 +0000 UTC" firstStartedPulling="2025-12-16 08:22:48.508499669 +0000 UTC m=+5506.770387298" lastFinishedPulling="2025-12-16 08:23:01.21810401 +0000 UTC m=+5519.479991639" observedRunningTime="2025-12-16 08:23:02.466231261 +0000 UTC m=+5520.728118900" watchObservedRunningTime="2025-12-16 08:23:02.476096513 +0000 UTC m=+5520.737984132" Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.608943 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.609019 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.643016 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:02 crc kubenswrapper[4789]: I1216 08:23:02.652819 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:03 crc kubenswrapper[4789]: I1216 08:23:03.390208 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:03 crc kubenswrapper[4789]: I1216 08:23:03.390257 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:04 crc kubenswrapper[4789]: I1216 08:23:04.105951 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:23:04 crc kubenswrapper[4789]: E1216 08:23:04.106480 4789 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:23:05 crc kubenswrapper[4789]: I1216 08:23:05.382113 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:08 crc kubenswrapper[4789]: I1216 08:23:08.032180 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:23:08 crc kubenswrapper[4789]: I1216 08:23:08.033422 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:23:08 crc kubenswrapper[4789]: I1216 08:23:08.247527 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:23:08 crc kubenswrapper[4789]: I1216 08:23:08.614874 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:23:08 crc kubenswrapper[4789]: I1216 08:23:08.614949 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:23:12 crc kubenswrapper[4789]: I1216 08:23:12.199662 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 08:23:12 crc kubenswrapper[4789]: I1216 08:23:12.199948 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 08:23:12 crc kubenswrapper[4789]: I1216 08:23:12.232682 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 
08:23:12 crc kubenswrapper[4789]: I1216 08:23:12.247640 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 08:23:12 crc kubenswrapper[4789]: I1216 08:23:12.471268 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 08:23:12 crc kubenswrapper[4789]: I1216 08:23:12.471310 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 08:23:14 crc kubenswrapper[4789]: I1216 08:23:14.407083 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 08:23:14 crc kubenswrapper[4789]: I1216 08:23:14.423613 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 08:23:18 crc kubenswrapper[4789]: I1216 08:23:18.034372 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75c9998c75-cdgh8" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Dec 16 08:23:18 crc kubenswrapper[4789]: I1216 08:23:18.105572 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:23:18 crc kubenswrapper[4789]: E1216 08:23:18.105824 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:23:18 crc kubenswrapper[4789]: 
I1216 08:23:18.619123 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f9d64d5cc-h75zb" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Dec 16 08:23:22 crc kubenswrapper[4789]: I1216 08:23:22.614385 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 08:23:29 crc kubenswrapper[4789]: I1216 08:23:29.106099 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:23:29 crc kubenswrapper[4789]: E1216 08:23:29.106874 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:23:29 crc kubenswrapper[4789]: I1216 08:23:29.901533 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:23:30 crc kubenswrapper[4789]: I1216 08:23:30.377975 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:23:31 crc kubenswrapper[4789]: I1216 08:23:31.579454 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.334982 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.429696 4789 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/horizon-75c9998c75-cdgh8"] Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.433567 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75c9998c75-cdgh8" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon-log" containerID="cri-o://6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e" gracePeriod=30 Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.433659 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75c9998c75-cdgh8" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" containerID="cri-o://34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c" gracePeriod=30 Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.656127 4789 generic.go:334] "Generic (PLEG): container finished" podID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerID="57aa49623098cd36b57468cd006fdd074ee09e3ca9fc2c11d11c3a2091272cda" exitCode=137 Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.656168 4789 generic.go:334] "Generic (PLEG): container finished" podID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerID="58fe57e00a7fcd23ee288d25d350f91defbb8713a09b1465190316df24a0d7c1" exitCode=137 Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.656191 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5687b5b99f-jt7pv" event={"ID":"3c494572-49a9-431b-88d5-ebaf4c93d86d","Type":"ContainerDied","Data":"57aa49623098cd36b57468cd006fdd074ee09e3ca9fc2c11d11c3a2091272cda"} Dec 16 08:23:32 crc kubenswrapper[4789]: I1216 08:23:32.656222 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5687b5b99f-jt7pv" event={"ID":"3c494572-49a9-431b-88d5-ebaf4c93d86d","Type":"ContainerDied","Data":"58fe57e00a7fcd23ee288d25d350f91defbb8713a09b1465190316df24a0d7c1"} Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.291734 4789 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.376208 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-scripts\") pod \"3c494572-49a9-431b-88d5-ebaf4c93d86d\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.376282 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c494572-49a9-431b-88d5-ebaf4c93d86d-horizon-secret-key\") pod \"3c494572-49a9-431b-88d5-ebaf4c93d86d\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.376397 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-config-data\") pod \"3c494572-49a9-431b-88d5-ebaf4c93d86d\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.376447 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c494572-49a9-431b-88d5-ebaf4c93d86d-logs\") pod \"3c494572-49a9-431b-88d5-ebaf4c93d86d\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.376541 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwbsd\" (UniqueName: \"kubernetes.io/projected/3c494572-49a9-431b-88d5-ebaf4c93d86d-kube-api-access-bwbsd\") pod \"3c494572-49a9-431b-88d5-ebaf4c93d86d\" (UID: \"3c494572-49a9-431b-88d5-ebaf4c93d86d\") " Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.377817 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3c494572-49a9-431b-88d5-ebaf4c93d86d-logs" (OuterVolumeSpecName: "logs") pod "3c494572-49a9-431b-88d5-ebaf4c93d86d" (UID: "3c494572-49a9-431b-88d5-ebaf4c93d86d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.379359 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c494572-49a9-431b-88d5-ebaf4c93d86d-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.382902 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c494572-49a9-431b-88d5-ebaf4c93d86d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3c494572-49a9-431b-88d5-ebaf4c93d86d" (UID: "3c494572-49a9-431b-88d5-ebaf4c93d86d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.383230 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c494572-49a9-431b-88d5-ebaf4c93d86d-kube-api-access-bwbsd" (OuterVolumeSpecName: "kube-api-access-bwbsd") pod "3c494572-49a9-431b-88d5-ebaf4c93d86d" (UID: "3c494572-49a9-431b-88d5-ebaf4c93d86d"). InnerVolumeSpecName "kube-api-access-bwbsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.404299 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-config-data" (OuterVolumeSpecName: "config-data") pod "3c494572-49a9-431b-88d5-ebaf4c93d86d" (UID: "3c494572-49a9-431b-88d5-ebaf4c93d86d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.412726 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-scripts" (OuterVolumeSpecName: "scripts") pod "3c494572-49a9-431b-88d5-ebaf4c93d86d" (UID: "3c494572-49a9-431b-88d5-ebaf4c93d86d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.481002 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.481042 4789 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3c494572-49a9-431b-88d5-ebaf4c93d86d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.481056 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c494572-49a9-431b-88d5-ebaf4c93d86d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.481068 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwbsd\" (UniqueName: \"kubernetes.io/projected/3c494572-49a9-431b-88d5-ebaf4c93d86d-kube-api-access-bwbsd\") on node \"crc\" DevicePath \"\"" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.665584 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5687b5b99f-jt7pv" event={"ID":"3c494572-49a9-431b-88d5-ebaf4c93d86d","Type":"ContainerDied","Data":"d1a6e9bc4f735f1f58d4e654c681f402f870609d11190bd8bc4271ba531394bf"} Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.665650 4789 scope.go:117] "RemoveContainer" 
containerID="57aa49623098cd36b57468cd006fdd074ee09e3ca9fc2c11d11c3a2091272cda" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.665654 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5687b5b99f-jt7pv" Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.699788 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5687b5b99f-jt7pv"] Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.710471 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5687b5b99f-jt7pv"] Dec 16 08:23:33 crc kubenswrapper[4789]: I1216 08:23:33.817802 4789 scope.go:117] "RemoveContainer" containerID="58fe57e00a7fcd23ee288d25d350f91defbb8713a09b1465190316df24a0d7c1" Dec 16 08:23:34 crc kubenswrapper[4789]: I1216 08:23:34.115494 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" path="/var/lib/kubelet/pods/3c494572-49a9-431b-88d5-ebaf4c93d86d/volumes" Dec 16 08:23:35 crc kubenswrapper[4789]: I1216 08:23:35.685999 4789 generic.go:334] "Generic (PLEG): container finished" podID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerID="34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c" exitCode=0 Dec 16 08:23:35 crc kubenswrapper[4789]: I1216 08:23:35.686042 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c9998c75-cdgh8" event={"ID":"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d","Type":"ContainerDied","Data":"34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c"} Dec 16 08:23:38 crc kubenswrapper[4789]: I1216 08:23:38.033453 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75c9998c75-cdgh8" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Dec 16 08:23:41 crc 
kubenswrapper[4789]: I1216 08:23:41.104845 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:23:41 crc kubenswrapper[4789]: E1216 08:23:41.105400 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:23:48 crc kubenswrapper[4789]: I1216 08:23:48.033392 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75c9998c75-cdgh8" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Dec 16 08:23:55 crc kubenswrapper[4789]: I1216 08:23:55.106057 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:23:55 crc kubenswrapper[4789]: E1216 08:23:55.106671 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:23:58 crc kubenswrapper[4789]: I1216 08:23:58.032985 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75c9998c75-cdgh8" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" probeResult="failure" output="Get 
\"http://10.217.1.94:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.94:8080: connect: connection refused" Dec 16 08:23:58 crc kubenswrapper[4789]: I1216 08:23:58.033102 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.830232 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.919202 4789 generic.go:334] "Generic (PLEG): container finished" podID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerID="6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e" exitCode=137 Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.919250 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c9998c75-cdgh8" event={"ID":"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d","Type":"ContainerDied","Data":"6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e"} Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.919272 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c9998c75-cdgh8" Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.919293 4789 scope.go:117] "RemoveContainer" containerID="34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c" Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.919280 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c9998c75-cdgh8" event={"ID":"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d","Type":"ContainerDied","Data":"c551b38a130ac5f500930e4c9af38a4e2a994fdd69b23b031a5ca9caec648668"} Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.978639 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-config-data\") pod \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.978745 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgvnc\" (UniqueName: \"kubernetes.io/projected/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-kube-api-access-xgvnc\") pod \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.978799 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-scripts\") pod \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.978821 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-horizon-secret-key\") pod \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " Dec 16 08:24:02 crc 
kubenswrapper[4789]: I1216 08:24:02.978895 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-logs\") pod \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\" (UID: \"85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d\") " Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.980046 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-logs" (OuterVolumeSpecName: "logs") pod "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" (UID: "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.984443 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" (UID: "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:24:02 crc kubenswrapper[4789]: I1216 08:24:02.985100 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-kube-api-access-xgvnc" (OuterVolumeSpecName: "kube-api-access-xgvnc") pod "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" (UID: "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d"). InnerVolumeSpecName "kube-api-access-xgvnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.006671 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-scripts" (OuterVolumeSpecName: "scripts") pod "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" (UID: "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.008590 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-config-data" (OuterVolumeSpecName: "config-data") pod "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" (UID: "85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.081712 4789 scope.go:117] "RemoveContainer" containerID="6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.081745 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.082032 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgvnc\" (UniqueName: \"kubernetes.io/projected/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-kube-api-access-xgvnc\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.082095 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.082165 4789 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.082231 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d-logs\") on node 
\"crc\" DevicePath \"\"" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.102807 4789 scope.go:117] "RemoveContainer" containerID="34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c" Dec 16 08:24:03 crc kubenswrapper[4789]: E1216 08:24:03.103412 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c\": container with ID starting with 34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c not found: ID does not exist" containerID="34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.103454 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c"} err="failed to get container status \"34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c\": rpc error: code = NotFound desc = could not find container \"34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c\": container with ID starting with 34600bc346d5b9b3fb54b92b2e4987bbd560b8a9927e6c7502fc3741b202f02c not found: ID does not exist" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.103487 4789 scope.go:117] "RemoveContainer" containerID="6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e" Dec 16 08:24:03 crc kubenswrapper[4789]: E1216 08:24:03.103979 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e\": container with ID starting with 6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e not found: ID does not exist" containerID="6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.104038 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e"} err="failed to get container status \"6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e\": rpc error: code = NotFound desc = could not find container \"6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e\": container with ID starting with 6485335808ab7675be8d451c4016b710ec866bc3c465f9b970a28cac2fd1a05e not found: ID does not exist" Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.259776 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75c9998c75-cdgh8"] Dec 16 08:24:03 crc kubenswrapper[4789]: I1216 08:24:03.269972 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75c9998c75-cdgh8"] Dec 16 08:24:04 crc kubenswrapper[4789]: I1216 08:24:04.115802 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" path="/var/lib/kubelet/pods/85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d/volumes" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.203473 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58fbf69d97-87vvw"] Dec 16 08:24:05 crc kubenswrapper[4789]: E1216 08:24:05.203901 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.203940 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" Dec 16 08:24:05 crc kubenswrapper[4789]: E1216 08:24:05.203968 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon-log" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.203975 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" 
containerName="horizon-log" Dec 16 08:24:05 crc kubenswrapper[4789]: E1216 08:24:05.203993 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon-log" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.204000 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon-log" Dec 16 08:24:05 crc kubenswrapper[4789]: E1216 08:24:05.204012 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.204020 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.204240 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.204270 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.204278 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fdfcca-fe34-4fc1-9a0f-cf719ba8cb1d" containerName="horizon-log" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.204295 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c494572-49a9-431b-88d5-ebaf4c93d86d" containerName="horizon-log" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.205354 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.216922 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58fbf69d97-87vvw"] Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.325182 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-config-data\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.325326 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdd9l\" (UniqueName: \"kubernetes.io/projected/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-kube-api-access-xdd9l\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.325400 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-horizon-secret-key\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.325446 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-logs\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.325472 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-scripts\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.427427 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-horizon-secret-key\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.427505 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-logs\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.427529 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-scripts\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.427579 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-config-data\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.427668 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdd9l\" (UniqueName: \"kubernetes.io/projected/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-kube-api-access-xdd9l\") pod 
\"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.428331 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-scripts\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.429163 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-config-data\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.429214 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-logs\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.433489 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-horizon-secret-key\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.450551 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdd9l\" (UniqueName: \"kubernetes.io/projected/f6dcac86-7cd3-427c-a5a3-24b2d4c02361-kube-api-access-xdd9l\") pod \"horizon-58fbf69d97-87vvw\" (UID: \"f6dcac86-7cd3-427c-a5a3-24b2d4c02361\") " pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc 
kubenswrapper[4789]: I1216 08:24:05.531079 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:05 crc kubenswrapper[4789]: I1216 08:24:05.991166 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58fbf69d97-87vvw"] Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.105552 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:24:06 crc kubenswrapper[4789]: E1216 08:24:06.105844 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.408826 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-n2c9f"] Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.410292 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.417298 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-n2c9f"] Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.516777 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-70bf-account-create-update-rcjlx"] Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.519137 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.521100 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.529146 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-70bf-account-create-update-rcjlx"] Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.556690 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493ed487-62fd-429c-bbef-2e2a28daa9f5-operator-scripts\") pod \"heat-db-create-n2c9f\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.557052 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq6xd\" (UniqueName: \"kubernetes.io/projected/493ed487-62fd-429c-bbef-2e2a28daa9f5-kube-api-access-cq6xd\") pod \"heat-db-create-n2c9f\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.659855 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493ed487-62fd-429c-bbef-2e2a28daa9f5-operator-scripts\") pod \"heat-db-create-n2c9f\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.660275 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snp8\" (UniqueName: \"kubernetes.io/projected/e697d55c-bf66-4e4a-a68d-150bfd848aeb-kube-api-access-7snp8\") pod \"heat-70bf-account-create-update-rcjlx\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " 
pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.660326 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq6xd\" (UniqueName: \"kubernetes.io/projected/493ed487-62fd-429c-bbef-2e2a28daa9f5-kube-api-access-cq6xd\") pod \"heat-db-create-n2c9f\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.660380 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e697d55c-bf66-4e4a-a68d-150bfd848aeb-operator-scripts\") pod \"heat-70bf-account-create-update-rcjlx\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.660792 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493ed487-62fd-429c-bbef-2e2a28daa9f5-operator-scripts\") pod \"heat-db-create-n2c9f\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.682648 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq6xd\" (UniqueName: \"kubernetes.io/projected/493ed487-62fd-429c-bbef-2e2a28daa9f5-kube-api-access-cq6xd\") pod \"heat-db-create-n2c9f\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.727804 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.762735 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snp8\" (UniqueName: \"kubernetes.io/projected/e697d55c-bf66-4e4a-a68d-150bfd848aeb-kube-api-access-7snp8\") pod \"heat-70bf-account-create-update-rcjlx\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.762797 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e697d55c-bf66-4e4a-a68d-150bfd848aeb-operator-scripts\") pod \"heat-70bf-account-create-update-rcjlx\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.763563 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e697d55c-bf66-4e4a-a68d-150bfd848aeb-operator-scripts\") pod \"heat-70bf-account-create-update-rcjlx\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.782687 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snp8\" (UniqueName: \"kubernetes.io/projected/e697d55c-bf66-4e4a-a68d-150bfd848aeb-kube-api-access-7snp8\") pod \"heat-70bf-account-create-update-rcjlx\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.843268 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.962648 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58fbf69d97-87vvw" event={"ID":"f6dcac86-7cd3-427c-a5a3-24b2d4c02361","Type":"ContainerStarted","Data":"8e9e9308fc168b66b6f19c94dd7753271609764cb5e80390f00e79901f71ad99"} Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.963173 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58fbf69d97-87vvw" event={"ID":"f6dcac86-7cd3-427c-a5a3-24b2d4c02361","Type":"ContainerStarted","Data":"2e06608cb9a91c209f65a04eda9c772bea0479d0960ae5e908a61ea8eaa58af1"} Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.963190 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58fbf69d97-87vvw" event={"ID":"f6dcac86-7cd3-427c-a5a3-24b2d4c02361","Type":"ContainerStarted","Data":"e4621f2c004a722479229d1e978fa6a2831b4553124812d4730b07069ef3bdeb"} Dec 16 08:24:06 crc kubenswrapper[4789]: I1216 08:24:06.998239 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58fbf69d97-87vvw" podStartSLOduration=1.998213973 podStartE2EDuration="1.998213973s" podCreationTimestamp="2025-12-16 08:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:24:06.985426141 +0000 UTC m=+5585.247313770" watchObservedRunningTime="2025-12-16 08:24:06.998213973 +0000 UTC m=+5585.260101602" Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.214045 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-n2c9f"] Dec 16 08:24:07 crc kubenswrapper[4789]: W1216 08:24:07.216000 4789 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493ed487_62fd_429c_bbef_2e2a28daa9f5.slice/crio-a82cd082ab0034530a29e85fa519205f66d046e9f00402583ba4779b73d80cdf WatchSource:0}: Error finding container a82cd082ab0034530a29e85fa519205f66d046e9f00402583ba4779b73d80cdf: Status 404 returned error can't find the container with id a82cd082ab0034530a29e85fa519205f66d046e9f00402583ba4779b73d80cdf Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.351479 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-70bf-account-create-update-rcjlx"] Dec 16 08:24:07 crc kubenswrapper[4789]: W1216 08:24:07.354075 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode697d55c_bf66_4e4a_a68d_150bfd848aeb.slice/crio-abc92c946aba2a79326b7880fca92665f5ba13812073af73318243e3d0baaf8a WatchSource:0}: Error finding container abc92c946aba2a79326b7880fca92665f5ba13812073af73318243e3d0baaf8a: Status 404 returned error can't find the container with id abc92c946aba2a79326b7880fca92665f5ba13812073af73318243e3d0baaf8a Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.972514 4789 generic.go:334] "Generic (PLEG): container finished" podID="e697d55c-bf66-4e4a-a68d-150bfd848aeb" containerID="7c5c96973b48500fdd950a9c9328fabecb981ca9822e3eb43681bc4897e94864" exitCode=0 Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.972787 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-70bf-account-create-update-rcjlx" event={"ID":"e697d55c-bf66-4e4a-a68d-150bfd848aeb","Type":"ContainerDied","Data":"7c5c96973b48500fdd950a9c9328fabecb981ca9822e3eb43681bc4897e94864"} Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.972810 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-70bf-account-create-update-rcjlx" 
event={"ID":"e697d55c-bf66-4e4a-a68d-150bfd848aeb","Type":"ContainerStarted","Data":"abc92c946aba2a79326b7880fca92665f5ba13812073af73318243e3d0baaf8a"} Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.974851 4789 generic.go:334] "Generic (PLEG): container finished" podID="493ed487-62fd-429c-bbef-2e2a28daa9f5" containerID="523d879fcb458f3047ee7911321cdd465ac7e87b987562f12e3689cec379fdfc" exitCode=0 Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.975009 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-n2c9f" event={"ID":"493ed487-62fd-429c-bbef-2e2a28daa9f5","Type":"ContainerDied","Data":"523d879fcb458f3047ee7911321cdd465ac7e87b987562f12e3689cec379fdfc"} Dec 16 08:24:07 crc kubenswrapper[4789]: I1216 08:24:07.975082 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-n2c9f" event={"ID":"493ed487-62fd-429c-bbef-2e2a28daa9f5","Type":"ContainerStarted","Data":"a82cd082ab0034530a29e85fa519205f66d046e9f00402583ba4779b73d80cdf"} Dec 16 08:24:08 crc kubenswrapper[4789]: I1216 08:24:08.063931 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wb5xw"] Dec 16 08:24:08 crc kubenswrapper[4789]: I1216 08:24:08.082074 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wb5xw"] Dec 16 08:24:08 crc kubenswrapper[4789]: I1216 08:24:08.094264 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-406b-account-create-update-c2wxr"] Dec 16 08:24:08 crc kubenswrapper[4789]: I1216 08:24:08.125679 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f6889d-889d-411f-be17-dcf5d7189a24" path="/var/lib/kubelet/pods/e0f6889d-889d-411f-be17-dcf5d7189a24/volumes" Dec 16 08:24:08 crc kubenswrapper[4789]: I1216 08:24:08.126364 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-406b-account-create-update-c2wxr"] Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 
08:24:09.375305 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.382236 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.417533 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq6xd\" (UniqueName: \"kubernetes.io/projected/493ed487-62fd-429c-bbef-2e2a28daa9f5-kube-api-access-cq6xd\") pod \"493ed487-62fd-429c-bbef-2e2a28daa9f5\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.417630 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e697d55c-bf66-4e4a-a68d-150bfd848aeb-operator-scripts\") pod \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.417691 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493ed487-62fd-429c-bbef-2e2a28daa9f5-operator-scripts\") pod \"493ed487-62fd-429c-bbef-2e2a28daa9f5\" (UID: \"493ed487-62fd-429c-bbef-2e2a28daa9f5\") " Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.417852 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snp8\" (UniqueName: \"kubernetes.io/projected/e697d55c-bf66-4e4a-a68d-150bfd848aeb-kube-api-access-7snp8\") pod \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\" (UID: \"e697d55c-bf66-4e4a-a68d-150bfd848aeb\") " Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.418826 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e697d55c-bf66-4e4a-a68d-150bfd848aeb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e697d55c-bf66-4e4a-a68d-150bfd848aeb" (UID: "e697d55c-bf66-4e4a-a68d-150bfd848aeb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.418992 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493ed487-62fd-429c-bbef-2e2a28daa9f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "493ed487-62fd-429c-bbef-2e2a28daa9f5" (UID: "493ed487-62fd-429c-bbef-2e2a28daa9f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.423443 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493ed487-62fd-429c-bbef-2e2a28daa9f5-kube-api-access-cq6xd" (OuterVolumeSpecName: "kube-api-access-cq6xd") pod "493ed487-62fd-429c-bbef-2e2a28daa9f5" (UID: "493ed487-62fd-429c-bbef-2e2a28daa9f5"). InnerVolumeSpecName "kube-api-access-cq6xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.424743 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e697d55c-bf66-4e4a-a68d-150bfd848aeb-kube-api-access-7snp8" (OuterVolumeSpecName: "kube-api-access-7snp8") pod "e697d55c-bf66-4e4a-a68d-150bfd848aeb" (UID: "e697d55c-bf66-4e4a-a68d-150bfd848aeb"). InnerVolumeSpecName "kube-api-access-7snp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.519484 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493ed487-62fd-429c-bbef-2e2a28daa9f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.519519 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7snp8\" (UniqueName: \"kubernetes.io/projected/e697d55c-bf66-4e4a-a68d-150bfd848aeb-kube-api-access-7snp8\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.519530 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq6xd\" (UniqueName: \"kubernetes.io/projected/493ed487-62fd-429c-bbef-2e2a28daa9f5-kube-api-access-cq6xd\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.519538 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e697d55c-bf66-4e4a-a68d-150bfd848aeb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.993445 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-n2c9f" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.993509 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-n2c9f" event={"ID":"493ed487-62fd-429c-bbef-2e2a28daa9f5","Type":"ContainerDied","Data":"a82cd082ab0034530a29e85fa519205f66d046e9f00402583ba4779b73d80cdf"} Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.994504 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a82cd082ab0034530a29e85fa519205f66d046e9f00402583ba4779b73d80cdf" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.995605 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-70bf-account-create-update-rcjlx" event={"ID":"e697d55c-bf66-4e4a-a68d-150bfd848aeb","Type":"ContainerDied","Data":"abc92c946aba2a79326b7880fca92665f5ba13812073af73318243e3d0baaf8a"} Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.995632 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc92c946aba2a79326b7880fca92665f5ba13812073af73318243e3d0baaf8a" Dec 16 08:24:09 crc kubenswrapper[4789]: I1216 08:24:09.995690 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-70bf-account-create-update-rcjlx" Dec 16 08:24:10 crc kubenswrapper[4789]: I1216 08:24:10.117718 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce88088b-1df2-439e-a1c6-9ff81ac4c86d" path="/var/lib/kubelet/pods/ce88088b-1df2-439e-a1c6-9ff81ac4c86d/volumes" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.709183 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-nv87k"] Dec 16 08:24:11 crc kubenswrapper[4789]: E1216 08:24:11.709621 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e697d55c-bf66-4e4a-a68d-150bfd848aeb" containerName="mariadb-account-create-update" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.709632 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e697d55c-bf66-4e4a-a68d-150bfd848aeb" containerName="mariadb-account-create-update" Dec 16 08:24:11 crc kubenswrapper[4789]: E1216 08:24:11.709661 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493ed487-62fd-429c-bbef-2e2a28daa9f5" containerName="mariadb-database-create" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.709667 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="493ed487-62fd-429c-bbef-2e2a28daa9f5" containerName="mariadb-database-create" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.709869 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="493ed487-62fd-429c-bbef-2e2a28daa9f5" containerName="mariadb-database-create" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.709898 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e697d55c-bf66-4e4a-a68d-150bfd848aeb" containerName="mariadb-account-create-update" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.710495 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.712501 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-wpczd" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.712625 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.718003 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-nv87k"] Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.756351 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-config-data\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.756452 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-combined-ca-bundle\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.756510 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswnq\" (UniqueName: \"kubernetes.io/projected/7120d155-aee7-4268-ab9f-f3adc640fb88-kube-api-access-jswnq\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.857852 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jswnq\" (UniqueName: \"kubernetes.io/projected/7120d155-aee7-4268-ab9f-f3adc640fb88-kube-api-access-jswnq\") pod 
\"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.858332 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-config-data\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.858418 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-combined-ca-bundle\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.873644 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-combined-ca-bundle\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.873810 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-config-data\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:11 crc kubenswrapper[4789]: I1216 08:24:11.877507 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswnq\" (UniqueName: \"kubernetes.io/projected/7120d155-aee7-4268-ab9f-f3adc640fb88-kube-api-access-jswnq\") pod \"heat-db-sync-nv87k\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:12 crc kubenswrapper[4789]: I1216 08:24:12.039772 
4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:12 crc kubenswrapper[4789]: I1216 08:24:12.505589 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-nv87k"] Dec 16 08:24:13 crc kubenswrapper[4789]: I1216 08:24:13.065851 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-nv87k" event={"ID":"7120d155-aee7-4268-ab9f-f3adc640fb88","Type":"ContainerStarted","Data":"53e45c9714ca2bc5c14e1e7491adda4ec49db4c21e43bb173e44eaa173669e5d"} Dec 16 08:24:15 crc kubenswrapper[4789]: I1216 08:24:15.531237 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:15 crc kubenswrapper[4789]: I1216 08:24:15.531561 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:18 crc kubenswrapper[4789]: I1216 08:24:18.105666 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:24:18 crc kubenswrapper[4789]: E1216 08:24:18.106369 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.006328 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vrtxq"] Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.021095 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.026386 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrtxq"] Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.191130 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-catalog-content\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.191355 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-utilities\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.192722 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s86z\" (UniqueName: \"kubernetes.io/projected/f3fb8dff-b982-4025-8735-0f955a2887b2-kube-api-access-6s86z\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.294234 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-utilities\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.294305 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6s86z\" (UniqueName: \"kubernetes.io/projected/f3fb8dff-b982-4025-8735-0f955a2887b2-kube-api-access-6s86z\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.294348 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-catalog-content\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.296324 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-catalog-content\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.296801 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-utilities\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.319875 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s86z\" (UniqueName: \"kubernetes.io/projected/f3fb8dff-b982-4025-8735-0f955a2887b2-kube-api-access-6s86z\") pod \"certified-operators-vrtxq\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:19 crc kubenswrapper[4789]: I1216 08:24:19.350567 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:21 crc kubenswrapper[4789]: I1216 08:24:21.053549 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-96q8n"] Dec 16 08:24:21 crc kubenswrapper[4789]: I1216 08:24:21.063732 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-96q8n"] Dec 16 08:24:21 crc kubenswrapper[4789]: I1216 08:24:21.723238 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrtxq"] Dec 16 08:24:21 crc kubenswrapper[4789]: W1216 08:24:21.724873 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3fb8dff_b982_4025_8735_0f955a2887b2.slice/crio-a0ed49e4b29f401fb898c9948e2479564e9e2677e10c7356b06232cd5aaf24e3 WatchSource:0}: Error finding container a0ed49e4b29f401fb898c9948e2479564e9e2677e10c7356b06232cd5aaf24e3: Status 404 returned error can't find the container with id a0ed49e4b29f401fb898c9948e2479564e9e2677e10c7356b06232cd5aaf24e3 Dec 16 08:24:22 crc kubenswrapper[4789]: I1216 08:24:22.117186 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3f7fc5-bcab-4dd2-a144-a075499d6a12" path="/var/lib/kubelet/pods/6a3f7fc5-bcab-4dd2-a144-a075499d6a12/volumes" Dec 16 08:24:22 crc kubenswrapper[4789]: I1216 08:24:22.157440 4789 generic.go:334] "Generic (PLEG): container finished" podID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerID="bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394" exitCode=0 Dec 16 08:24:22 crc kubenswrapper[4789]: I1216 08:24:22.157560 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrtxq" event={"ID":"f3fb8dff-b982-4025-8735-0f955a2887b2","Type":"ContainerDied","Data":"bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394"} Dec 16 08:24:22 crc kubenswrapper[4789]: I1216 08:24:22.157838 
4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrtxq" event={"ID":"f3fb8dff-b982-4025-8735-0f955a2887b2","Type":"ContainerStarted","Data":"a0ed49e4b29f401fb898c9948e2479564e9e2677e10c7356b06232cd5aaf24e3"} Dec 16 08:24:22 crc kubenswrapper[4789]: I1216 08:24:22.165098 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-nv87k" event={"ID":"7120d155-aee7-4268-ab9f-f3adc640fb88","Type":"ContainerStarted","Data":"935a814bb7c56e1f0690250e92a1bc1c636355763b4dec39b6531540696220d8"} Dec 16 08:24:22 crc kubenswrapper[4789]: I1216 08:24:22.202432 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-nv87k" podStartSLOduration=2.418471409 podStartE2EDuration="11.202414346s" podCreationTimestamp="2025-12-16 08:24:11 +0000 UTC" firstStartedPulling="2025-12-16 08:24:12.514208494 +0000 UTC m=+5590.776096123" lastFinishedPulling="2025-12-16 08:24:21.298151431 +0000 UTC m=+5599.560039060" observedRunningTime="2025-12-16 08:24:22.195273641 +0000 UTC m=+5600.457161270" watchObservedRunningTime="2025-12-16 08:24:22.202414346 +0000 UTC m=+5600.464301975" Dec 16 08:24:23 crc kubenswrapper[4789]: I1216 08:24:23.182879 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrtxq" event={"ID":"f3fb8dff-b982-4025-8735-0f955a2887b2","Type":"ContainerStarted","Data":"467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad"} Dec 16 08:24:24 crc kubenswrapper[4789]: I1216 08:24:24.191538 4789 generic.go:334] "Generic (PLEG): container finished" podID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerID="467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad" exitCode=0 Dec 16 08:24:24 crc kubenswrapper[4789]: I1216 08:24:24.191579 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrtxq" 
event={"ID":"f3fb8dff-b982-4025-8735-0f955a2887b2","Type":"ContainerDied","Data":"467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad"} Dec 16 08:24:25 crc kubenswrapper[4789]: I1216 08:24:25.210400 4789 generic.go:334] "Generic (PLEG): container finished" podID="7120d155-aee7-4268-ab9f-f3adc640fb88" containerID="935a814bb7c56e1f0690250e92a1bc1c636355763b4dec39b6531540696220d8" exitCode=0 Dec 16 08:24:25 crc kubenswrapper[4789]: I1216 08:24:25.210898 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-nv87k" event={"ID":"7120d155-aee7-4268-ab9f-f3adc640fb88","Type":"ContainerDied","Data":"935a814bb7c56e1f0690250e92a1bc1c636355763b4dec39b6531540696220d8"} Dec 16 08:24:25 crc kubenswrapper[4789]: I1216 08:24:25.236343 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrtxq" event={"ID":"f3fb8dff-b982-4025-8735-0f955a2887b2","Type":"ContainerStarted","Data":"17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944"} Dec 16 08:24:25 crc kubenswrapper[4789]: I1216 08:24:25.261864 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vrtxq" podStartSLOduration=4.68282748 podStartE2EDuration="7.261848995s" podCreationTimestamp="2025-12-16 08:24:18 +0000 UTC" firstStartedPulling="2025-12-16 08:24:22.159820385 +0000 UTC m=+5600.421708014" lastFinishedPulling="2025-12-16 08:24:24.7388419 +0000 UTC m=+5603.000729529" observedRunningTime="2025-12-16 08:24:25.257030538 +0000 UTC m=+5603.518918177" watchObservedRunningTime="2025-12-16 08:24:25.261848995 +0000 UTC m=+5603.523736624" Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.583148 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.751335 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-combined-ca-bundle\") pod \"7120d155-aee7-4268-ab9f-f3adc640fb88\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.751416 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jswnq\" (UniqueName: \"kubernetes.io/projected/7120d155-aee7-4268-ab9f-f3adc640fb88-kube-api-access-jswnq\") pod \"7120d155-aee7-4268-ab9f-f3adc640fb88\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.751455 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-config-data\") pod \"7120d155-aee7-4268-ab9f-f3adc640fb88\" (UID: \"7120d155-aee7-4268-ab9f-f3adc640fb88\") " Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.756489 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7120d155-aee7-4268-ab9f-f3adc640fb88-kube-api-access-jswnq" (OuterVolumeSpecName: "kube-api-access-jswnq") pod "7120d155-aee7-4268-ab9f-f3adc640fb88" (UID: "7120d155-aee7-4268-ab9f-f3adc640fb88"). InnerVolumeSpecName "kube-api-access-jswnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.793268 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7120d155-aee7-4268-ab9f-f3adc640fb88" (UID: "7120d155-aee7-4268-ab9f-f3adc640fb88"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.826581 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-config-data" (OuterVolumeSpecName: "config-data") pod "7120d155-aee7-4268-ab9f-f3adc640fb88" (UID: "7120d155-aee7-4268-ab9f-f3adc640fb88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.855401 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.855437 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7120d155-aee7-4268-ab9f-f3adc640fb88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:26 crc kubenswrapper[4789]: I1216 08:24:26.855454 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jswnq\" (UniqueName: \"kubernetes.io/projected/7120d155-aee7-4268-ab9f-f3adc640fb88-kube-api-access-jswnq\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:27 crc kubenswrapper[4789]: I1216 08:24:27.255981 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-nv87k" event={"ID":"7120d155-aee7-4268-ab9f-f3adc640fb88","Type":"ContainerDied","Data":"53e45c9714ca2bc5c14e1e7491adda4ec49db4c21e43bb173e44eaa173669e5d"} Dec 16 08:24:27 crc kubenswrapper[4789]: I1216 08:24:27.256035 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53e45c9714ca2bc5c14e1e7491adda4ec49db4c21e43bb173e44eaa173669e5d" Dec 16 08:24:27 crc kubenswrapper[4789]: I1216 08:24:27.256049 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-nv87k" Dec 16 08:24:27 crc kubenswrapper[4789]: I1216 08:24:27.340221 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.270545 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-754bb6b78c-jqn25"] Dec 16 08:24:28 crc kubenswrapper[4789]: E1216 08:24:28.271542 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7120d155-aee7-4268-ab9f-f3adc640fb88" containerName="heat-db-sync" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.271560 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7120d155-aee7-4268-ab9f-f3adc640fb88" containerName="heat-db-sync" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.271769 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7120d155-aee7-4268-ab9f-f3adc640fb88" containerName="heat-db-sync" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.272656 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.277666 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-wpczd" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.277822 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.277822 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.281194 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-754bb6b78c-jqn25"] Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.389438 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-combined-ca-bundle\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.389532 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-config-data-custom\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.389557 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-config-data\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: 
I1216 08:24:28.389733 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kdql\" (UniqueName: \"kubernetes.io/projected/c203e891-92ba-4644-8138-b8375640c961-kube-api-access-2kdql\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.413064 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-c6745d44c-ww76k"] Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.414745 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.417593 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.445570 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c6745d44c-ww76k"] Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.487995 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5b79b95f86-txdfg"] Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.490155 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.493239 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.505025 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kdql\" (UniqueName: \"kubernetes.io/projected/c203e891-92ba-4644-8138-b8375640c961-kube-api-access-2kdql\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.513205 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b79b95f86-txdfg"] Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.520015 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-combined-ca-bundle\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.520260 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-config-data-custom\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.520863 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-config-data\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc 
kubenswrapper[4789]: I1216 08:24:28.546001 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-config-data\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.547277 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-combined-ca-bundle\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.553728 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c203e891-92ba-4644-8138-b8375640c961-config-data-custom\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.562461 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kdql\" (UniqueName: \"kubernetes.io/projected/c203e891-92ba-4644-8138-b8375640c961-kube-api-access-2kdql\") pod \"heat-engine-754bb6b78c-jqn25\" (UID: \"c203e891-92ba-4644-8138-b8375640c961\") " pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.595390 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649447 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-config-data-custom\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649576 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-combined-ca-bundle\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649664 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdkn\" (UniqueName: \"kubernetes.io/projected/ac197a7d-175c-4aec-b5cf-cfa32de39925-kube-api-access-mpdkn\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649695 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-config-data-custom\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649774 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-config-data\") pod 
\"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649821 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76bwh\" (UniqueName: \"kubernetes.io/projected/8d1b396a-0632-4f6c-9668-dd2cb3038923-kube-api-access-76bwh\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649844 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-config-data\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.649992 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-combined-ca-bundle\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753602 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-config-data\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753655 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76bwh\" (UniqueName: \"kubernetes.io/projected/8d1b396a-0632-4f6c-9668-dd2cb3038923-kube-api-access-76bwh\") pod 
\"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-config-data\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753724 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-combined-ca-bundle\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753769 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-config-data-custom\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753826 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-combined-ca-bundle\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753877 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdkn\" (UniqueName: \"kubernetes.io/projected/ac197a7d-175c-4aec-b5cf-cfa32de39925-kube-api-access-mpdkn\") pod \"heat-api-5b79b95f86-txdfg\" (UID: 
\"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.753940 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-config-data-custom\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.771546 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-config-data-custom\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.772410 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-config-data\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.774130 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-config-data\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.779100 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-combined-ca-bundle\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc 
kubenswrapper[4789]: I1216 08:24:28.782919 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac197a7d-175c-4aec-b5cf-cfa32de39925-config-data-custom\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.785143 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdkn\" (UniqueName: \"kubernetes.io/projected/ac197a7d-175c-4aec-b5cf-cfa32de39925-kube-api-access-mpdkn\") pod \"heat-api-5b79b95f86-txdfg\" (UID: \"ac197a7d-175c-4aec-b5cf-cfa32de39925\") " pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.786366 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76bwh\" (UniqueName: \"kubernetes.io/projected/8d1b396a-0632-4f6c-9668-dd2cb3038923-kube-api-access-76bwh\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:28 crc kubenswrapper[4789]: I1216 08:24:28.815053 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1b396a-0632-4f6c-9668-dd2cb3038923-combined-ca-bundle\") pod \"heat-cfnapi-c6745d44c-ww76k\" (UID: \"8d1b396a-0632-4f6c-9668-dd2cb3038923\") " pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.024018 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.045810 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.105512 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987" Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.181540 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-754bb6b78c-jqn25"] Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.250357 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-58fbf69d97-87vvw" Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.302404 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-754bb6b78c-jqn25" event={"ID":"c203e891-92ba-4644-8138-b8375640c961","Type":"ContainerStarted","Data":"594076c8ce23e97dbdd8bd4523adbd8a2c9499de1669a1497599863b7b7ede50"} Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.314384 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f9d64d5cc-h75zb"] Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.314635 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f9d64d5cc-h75zb" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon-log" containerID="cri-o://4ef17f0617de7ff1aa5a06845af58c14c66565e017c2f85f43b0d884753e03b4" gracePeriod=30 Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.315131 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f9d64d5cc-h75zb" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon" containerID="cri-o://11849b79e5ea201afcfdc834fc7d0612b50e09f3d28214ac641f44c75a5e094a" gracePeriod=30 Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.353042 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:29 crc kubenswrapper[4789]: 
I1216 08:24:29.353353 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.419698 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.587889 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b79b95f86-txdfg"] Dec 16 08:24:29 crc kubenswrapper[4789]: I1216 08:24:29.695777 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c6745d44c-ww76k"] Dec 16 08:24:29 crc kubenswrapper[4789]: W1216 08:24:29.701183 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1b396a_0632_4f6c_9668_dd2cb3038923.slice/crio-ebe37a60e98280971e8a1260dba3477057dca733b0d70129613a255a7eabe326 WatchSource:0}: Error finding container ebe37a60e98280971e8a1260dba3477057dca733b0d70129613a255a7eabe326: Status 404 returned error can't find the container with id ebe37a60e98280971e8a1260dba3477057dca733b0d70129613a255a7eabe326 Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.319003 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-754bb6b78c-jqn25" event={"ID":"c203e891-92ba-4644-8138-b8375640c961","Type":"ContainerStarted","Data":"777befa87815f642590914dae81931de79391d7c9cfddc690f872e20eb81ced5"} Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.319455 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.330853 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c6745d44c-ww76k" 
event={"ID":"8d1b396a-0632-4f6c-9668-dd2cb3038923","Type":"ContainerStarted","Data":"ebe37a60e98280971e8a1260dba3477057dca733b0d70129613a255a7eabe326"} Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.339536 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-754bb6b78c-jqn25" podStartSLOduration=2.339521391 podStartE2EDuration="2.339521391s" podCreationTimestamp="2025-12-16 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:24:30.335246426 +0000 UTC m=+5608.597134055" watchObservedRunningTime="2025-12-16 08:24:30.339521391 +0000 UTC m=+5608.601409010" Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.341952 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b79b95f86-txdfg" event={"ID":"ac197a7d-175c-4aec-b5cf-cfa32de39925","Type":"ContainerStarted","Data":"4b4a5545bfd97526d23586df7fbf9b1eede7e6831bd2b506aaf1a6392d2fa395"} Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.345802 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"93694727a36708bd1006bf75a00f6ac1c8b551c001d91ed3b60ce8e5c8ebae39"} Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.425940 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.480201 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrtxq"] Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.800447 4789 scope.go:117] "RemoveContainer" containerID="df31c7050a286d36f3cb0e1122242b4802ed650e327459774199353e29e7c125" Dec 16 08:24:30 crc kubenswrapper[4789]: I1216 08:24:30.838312 4789 scope.go:117] 
"RemoveContainer" containerID="a6b75627db7c0e28090161e5d7c0c6b8e3a8c0bf761d9d4bd0529b47f75cf842" Dec 16 08:24:31 crc kubenswrapper[4789]: I1216 08:24:31.248880 4789 scope.go:117] "RemoveContainer" containerID="eb7aaff6096b1844d408960c85089bfbbcdea51e4a75c3300eba39f2b5bfed56" Dec 16 08:24:32 crc kubenswrapper[4789]: I1216 08:24:32.373048 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c6745d44c-ww76k" event={"ID":"8d1b396a-0632-4f6c-9668-dd2cb3038923","Type":"ContainerStarted","Data":"ba397830ea5f6ca93b23cd4e16c95b96fd94f09b64540f1aa1c8775419f2bce7"} Dec 16 08:24:32 crc kubenswrapper[4789]: I1216 08:24:32.373812 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:32 crc kubenswrapper[4789]: I1216 08:24:32.376503 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b79b95f86-txdfg" event={"ID":"ac197a7d-175c-4aec-b5cf-cfa32de39925","Type":"ContainerStarted","Data":"c3168d3ff6dba53cc97df3b3b77ca0bbfd345462494112ce5b50c3fa6f9ec934"} Dec 16 08:24:32 crc kubenswrapper[4789]: I1216 08:24:32.376462 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vrtxq" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="registry-server" containerID="cri-o://17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944" gracePeriod=2 Dec 16 08:24:32 crc kubenswrapper[4789]: I1216 08:24:32.377502 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:32 crc kubenswrapper[4789]: I1216 08:24:32.400015 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-c6745d44c-ww76k" podStartSLOduration=2.790359621 podStartE2EDuration="4.398897563s" podCreationTimestamp="2025-12-16 08:24:28 +0000 UTC" firstStartedPulling="2025-12-16 08:24:29.703231086 +0000 UTC 
m=+5607.965118705" lastFinishedPulling="2025-12-16 08:24:31.311769018 +0000 UTC m=+5609.573656647" observedRunningTime="2025-12-16 08:24:32.392617469 +0000 UTC m=+5610.654505098" watchObservedRunningTime="2025-12-16 08:24:32.398897563 +0000 UTC m=+5610.660785192" Dec 16 08:24:32 crc kubenswrapper[4789]: I1216 08:24:32.411859 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5b79b95f86-txdfg" podStartSLOduration=2.705256771 podStartE2EDuration="4.411838549s" podCreationTimestamp="2025-12-16 08:24:28 +0000 UTC" firstStartedPulling="2025-12-16 08:24:29.579590034 +0000 UTC m=+5607.841477663" lastFinishedPulling="2025-12-16 08:24:31.286171812 +0000 UTC m=+5609.548059441" observedRunningTime="2025-12-16 08:24:32.411587913 +0000 UTC m=+5610.673475542" watchObservedRunningTime="2025-12-16 08:24:32.411838549 +0000 UTC m=+5610.673726198" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.371313 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.388642 4789 generic.go:334] "Generic (PLEG): container finished" podID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerID="11849b79e5ea201afcfdc834fc7d0612b50e09f3d28214ac641f44c75a5e094a" exitCode=0 Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.388725 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d64d5cc-h75zb" event={"ID":"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b","Type":"ContainerDied","Data":"11849b79e5ea201afcfdc834fc7d0612b50e09f3d28214ac641f44c75a5e094a"} Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.393604 4789 generic.go:334] "Generic (PLEG): container finished" podID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerID="17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944" exitCode=0 Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.393710 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrtxq" event={"ID":"f3fb8dff-b982-4025-8735-0f955a2887b2","Type":"ContainerDied","Data":"17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944"} Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.393754 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrtxq" event={"ID":"f3fb8dff-b982-4025-8735-0f955a2887b2","Type":"ContainerDied","Data":"a0ed49e4b29f401fb898c9948e2479564e9e2677e10c7356b06232cd5aaf24e3"} Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.393775 4789 scope.go:117] "RemoveContainer" containerID="17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.393887 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrtxq" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.416012 4789 scope.go:117] "RemoveContainer" containerID="467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.439452 4789 scope.go:117] "RemoveContainer" containerID="bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.465370 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-utilities\") pod \"f3fb8dff-b982-4025-8735-0f955a2887b2\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.466035 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-utilities" (OuterVolumeSpecName: "utilities") pod "f3fb8dff-b982-4025-8735-0f955a2887b2" (UID: "f3fb8dff-b982-4025-8735-0f955a2887b2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.466197 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s86z\" (UniqueName: \"kubernetes.io/projected/f3fb8dff-b982-4025-8735-0f955a2887b2-kube-api-access-6s86z\") pod \"f3fb8dff-b982-4025-8735-0f955a2887b2\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.466437 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-catalog-content\") pod \"f3fb8dff-b982-4025-8735-0f955a2887b2\" (UID: \"f3fb8dff-b982-4025-8735-0f955a2887b2\") " Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.473235 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3fb8dff-b982-4025-8735-0f955a2887b2-kube-api-access-6s86z" (OuterVolumeSpecName: "kube-api-access-6s86z") pod "f3fb8dff-b982-4025-8735-0f955a2887b2" (UID: "f3fb8dff-b982-4025-8735-0f955a2887b2"). InnerVolumeSpecName "kube-api-access-6s86z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.475715 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.477136 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s86z\" (UniqueName: \"kubernetes.io/projected/f3fb8dff-b982-4025-8735-0f955a2887b2-kube-api-access-6s86z\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.477242 4789 scope.go:117] "RemoveContainer" containerID="17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944" Dec 16 08:24:33 crc kubenswrapper[4789]: E1216 08:24:33.478317 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944\": container with ID starting with 17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944 not found: ID does not exist" containerID="17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.478353 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944"} err="failed to get container status \"17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944\": rpc error: code = NotFound desc = could not find container \"17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944\": container with ID starting with 17072ce6b07b8a9b7959fe09d589f424a5b9a85b03bfef3c0c6a4f9f96b68944 not found: ID does not exist" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.478373 4789 scope.go:117] "RemoveContainer" 
containerID="467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad" Dec 16 08:24:33 crc kubenswrapper[4789]: E1216 08:24:33.480190 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad\": container with ID starting with 467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad not found: ID does not exist" containerID="467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.480293 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad"} err="failed to get container status \"467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad\": rpc error: code = NotFound desc = could not find container \"467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad\": container with ID starting with 467a82f1e6a6d4c10b76f3efa61322cf9aa1fc4afbc7c61ecdb318e4f23973ad not found: ID does not exist" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.480323 4789 scope.go:117] "RemoveContainer" containerID="bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394" Dec 16 08:24:33 crc kubenswrapper[4789]: E1216 08:24:33.480677 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394\": container with ID starting with bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394 not found: ID does not exist" containerID="bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.480708 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394"} err="failed to get container status \"bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394\": rpc error: code = NotFound desc = could not find container \"bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394\": container with ID starting with bac02433a3b2fa6102053700e9effb82ff048b4bbd6f47b7cec46d45600e7394 not found: ID does not exist" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.518254 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3fb8dff-b982-4025-8735-0f955a2887b2" (UID: "f3fb8dff-b982-4025-8735-0f955a2887b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.578663 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb8dff-b982-4025-8735-0f955a2887b2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.733449 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrtxq"] Dec 16 08:24:33 crc kubenswrapper[4789]: I1216 08:24:33.743517 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vrtxq"] Dec 16 08:24:34 crc kubenswrapper[4789]: I1216 08:24:34.118464 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" path="/var/lib/kubelet/pods/f3fb8dff-b982-4025-8735-0f955a2887b2/volumes" Dec 16 08:24:38 crc kubenswrapper[4789]: I1216 08:24:38.614862 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f9d64d5cc-h75zb" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" 
containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Dec 16 08:24:40 crc kubenswrapper[4789]: I1216 08:24:40.368770 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-c6745d44c-ww76k" Dec 16 08:24:40 crc kubenswrapper[4789]: I1216 08:24:40.422588 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5b79b95f86-txdfg" Dec 16 08:24:45 crc kubenswrapper[4789]: I1216 08:24:45.058990 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7c8qf"] Dec 16 08:24:45 crc kubenswrapper[4789]: I1216 08:24:45.071067 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7c8qf"] Dec 16 08:24:46 crc kubenswrapper[4789]: I1216 08:24:46.029222 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3b71-account-create-update-6shm9"] Dec 16 08:24:46 crc kubenswrapper[4789]: I1216 08:24:46.040516 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3b71-account-create-update-6shm9"] Dec 16 08:24:46 crc kubenswrapper[4789]: I1216 08:24:46.118712 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2263c18-9034-41a3-a1c1-9833bda12fa3" path="/var/lib/kubelet/pods/a2263c18-9034-41a3-a1c1-9833bda12fa3/volumes" Dec 16 08:24:46 crc kubenswrapper[4789]: I1216 08:24:46.119403 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e847b977-90db-4d0a-91eb-f38fa7cd9035" path="/var/lib/kubelet/pods/e847b977-90db-4d0a-91eb-f38fa7cd9035/volumes" Dec 16 08:24:48 crc kubenswrapper[4789]: I1216 08:24:48.615428 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f9d64d5cc-h75zb" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon" probeResult="failure" output="Get 
\"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Dec 16 08:24:48 crc kubenswrapper[4789]: I1216 08:24:48.632065 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-754bb6b78c-jqn25" Dec 16 08:24:55 crc kubenswrapper[4789]: I1216 08:24:55.040837 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zfv7d"] Dec 16 08:24:55 crc kubenswrapper[4789]: I1216 08:24:55.054628 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zfv7d"] Dec 16 08:24:56 crc kubenswrapper[4789]: I1216 08:24:56.118418 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca73e6e-b52b-44af-821f-96e2e7be8bf3" path="/var/lib/kubelet/pods/cca73e6e-b52b-44af-821f-96e2e7be8bf3/volumes" Dec 16 08:24:58 crc kubenswrapper[4789]: I1216 08:24:58.615460 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-f9d64d5cc-h75zb" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Dec 16 08:24:58 crc kubenswrapper[4789]: I1216 08:24:58.616124 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.623896 4789 generic.go:334] "Generic (PLEG): container finished" podID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerID="4ef17f0617de7ff1aa5a06845af58c14c66565e017c2f85f43b0d884753e03b4" exitCode=137 Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.624196 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d64d5cc-h75zb" event={"ID":"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b","Type":"ContainerDied","Data":"4ef17f0617de7ff1aa5a06845af58c14c66565e017c2f85f43b0d884753e03b4"} Dec 
16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.666766 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt"] Dec 16 08:24:59 crc kubenswrapper[4789]: E1216 08:24:59.669051 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="extract-content" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.669081 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="extract-content" Dec 16 08:24:59 crc kubenswrapper[4789]: E1216 08:24:59.669098 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="extract-utilities" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.669105 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="extract-utilities" Dec 16 08:24:59 crc kubenswrapper[4789]: E1216 08:24:59.669128 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="registry-server" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.669135 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="registry-server" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.671480 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3fb8dff-b982-4025-8735-0f955a2887b2" containerName="registry-server" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.673932 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.676038 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.678542 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt"] Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.780594 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.812998 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.813220 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.813358 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtg6x\" (UniqueName: \"kubernetes.io/projected/27e0476c-3b9a-4129-8376-55b976dbcadc-kube-api-access-wtg6x\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" 
(UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.914488 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b44nf\" (UniqueName: \"kubernetes.io/projected/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-kube-api-access-b44nf\") pod \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.914634 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-horizon-secret-key\") pod \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.914744 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-scripts\") pod \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.914771 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-logs\") pod \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.914811 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-config-data\") pod \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\" (UID: \"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b\") " Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.915205 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.915301 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-logs" (OuterVolumeSpecName: "logs") pod "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" (UID: "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.915615 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.915733 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.916054 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: 
\"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.916138 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtg6x\" (UniqueName: \"kubernetes.io/projected/27e0476c-3b9a-4129-8376-55b976dbcadc-kube-api-access-wtg6x\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.916611 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.929169 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" (UID: "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.933866 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtg6x\" (UniqueName: \"kubernetes.io/projected/27e0476c-3b9a-4129-8376-55b976dbcadc-kube-api-access-wtg6x\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.935814 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-kube-api-access-b44nf" (OuterVolumeSpecName: "kube-api-access-b44nf") pod "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" (UID: "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b"). InnerVolumeSpecName "kube-api-access-b44nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.937552 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-scripts" (OuterVolumeSpecName: "scripts") pod "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" (UID: "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:24:59 crc kubenswrapper[4789]: I1216 08:24:59.937670 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-config-data" (OuterVolumeSpecName: "config-data") pod "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" (UID: "e431d2d0-98ab-42f4-bb32-3fe3aca72c6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.018171 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.018207 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b44nf\" (UniqueName: \"kubernetes.io/projected/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-kube-api-access-b44nf\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.018221 4789 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.018232 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.072691 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.511730 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt"] Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.633826 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d64d5cc-h75zb" event={"ID":"e431d2d0-98ab-42f4-bb32-3fe3aca72c6b","Type":"ContainerDied","Data":"350907d9d13ef348919167611f66208b0c9d2f2fa0da4d38b56d4a8717f89113"} Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.633864 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f9d64d5cc-h75zb" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.633922 4789 scope.go:117] "RemoveContainer" containerID="11849b79e5ea201afcfdc834fc7d0612b50e09f3d28214ac641f44c75a5e094a" Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.636518 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" event={"ID":"27e0476c-3b9a-4129-8376-55b976dbcadc","Type":"ContainerStarted","Data":"6b8971c8589d752d045b61a2b6db1e6dc1412e60725d5c00ca1b85475cd371e0"} Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.665768 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f9d64d5cc-h75zb"] Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.676813 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f9d64d5cc-h75zb"] Dec 16 08:25:00 crc kubenswrapper[4789]: I1216 08:25:00.790224 4789 scope.go:117] "RemoveContainer" containerID="4ef17f0617de7ff1aa5a06845af58c14c66565e017c2f85f43b0d884753e03b4" Dec 16 08:25:01 crc kubenswrapper[4789]: I1216 08:25:01.646399 4789 generic.go:334] "Generic (PLEG): container finished" podID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerID="a1e79b27ed17cc145530b58af1de1d27463dfaa24ed8e3c69ee4b5a8b3a399d7" exitCode=0 Dec 16 08:25:01 crc kubenswrapper[4789]: I1216 08:25:01.646436 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" event={"ID":"27e0476c-3b9a-4129-8376-55b976dbcadc","Type":"ContainerDied","Data":"a1e79b27ed17cc145530b58af1de1d27463dfaa24ed8e3c69ee4b5a8b3a399d7"} Dec 16 08:25:02 crc kubenswrapper[4789]: I1216 08:25:02.116018 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" path="/var/lib/kubelet/pods/e431d2d0-98ab-42f4-bb32-3fe3aca72c6b/volumes" Dec 16 08:25:04 crc 
kubenswrapper[4789]: I1216 08:25:04.691798 4789 generic.go:334] "Generic (PLEG): container finished" podID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerID="e541a405f7a3f34f72ada4ec862f8814f8df517b5b10869d94834cf2b83f67bb" exitCode=0 Dec 16 08:25:04 crc kubenswrapper[4789]: I1216 08:25:04.692445 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" event={"ID":"27e0476c-3b9a-4129-8376-55b976dbcadc","Type":"ContainerDied","Data":"e541a405f7a3f34f72ada4ec862f8814f8df517b5b10869d94834cf2b83f67bb"} Dec 16 08:25:05 crc kubenswrapper[4789]: I1216 08:25:05.703113 4789 generic.go:334] "Generic (PLEG): container finished" podID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerID="e8095b1180c224ddb88d51c350cac37ddc36ab0449de97607f3297283a2b3247" exitCode=0 Dec 16 08:25:05 crc kubenswrapper[4789]: I1216 08:25:05.703174 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" event={"ID":"27e0476c-3b9a-4129-8376-55b976dbcadc","Type":"ContainerDied","Data":"e8095b1180c224ddb88d51c350cac37ddc36ab0449de97607f3297283a2b3247"} Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.142694 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.157898 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtg6x\" (UniqueName: \"kubernetes.io/projected/27e0476c-3b9a-4129-8376-55b976dbcadc-kube-api-access-wtg6x\") pod \"27e0476c-3b9a-4129-8376-55b976dbcadc\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.158114 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-util\") pod \"27e0476c-3b9a-4129-8376-55b976dbcadc\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.158207 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-bundle\") pod \"27e0476c-3b9a-4129-8376-55b976dbcadc\" (UID: \"27e0476c-3b9a-4129-8376-55b976dbcadc\") " Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.163834 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-bundle" (OuterVolumeSpecName: "bundle") pod "27e0476c-3b9a-4129-8376-55b976dbcadc" (UID: "27e0476c-3b9a-4129-8376-55b976dbcadc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.166097 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e0476c-3b9a-4129-8376-55b976dbcadc-kube-api-access-wtg6x" (OuterVolumeSpecName: "kube-api-access-wtg6x") pod "27e0476c-3b9a-4129-8376-55b976dbcadc" (UID: "27e0476c-3b9a-4129-8376-55b976dbcadc"). InnerVolumeSpecName "kube-api-access-wtg6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.167424 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-util" (OuterVolumeSpecName: "util") pod "27e0476c-3b9a-4129-8376-55b976dbcadc" (UID: "27e0476c-3b9a-4129-8376-55b976dbcadc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.261218 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtg6x\" (UniqueName: \"kubernetes.io/projected/27e0476c-3b9a-4129-8376-55b976dbcadc-kube-api-access-wtg6x\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.261256 4789 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-util\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.261268 4789 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27e0476c-3b9a-4129-8376-55b976dbcadc-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.731900 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" event={"ID":"27e0476c-3b9a-4129-8376-55b976dbcadc","Type":"ContainerDied","Data":"6b8971c8589d752d045b61a2b6db1e6dc1412e60725d5c00ca1b85475cd371e0"} Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.732240 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8971c8589d752d045b61a2b6db1e6dc1412e60725d5c00ca1b85475cd371e0" Dec 16 08:25:07 crc kubenswrapper[4789]: I1216 08:25:07.732330 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.383995 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc"] Dec 16 08:25:19 crc kubenswrapper[4789]: E1216 08:25:19.386532 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.386582 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon" Dec 16 08:25:19 crc kubenswrapper[4789]: E1216 08:25:19.386608 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerName="util" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.386639 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerName="util" Dec 16 08:25:19 crc kubenswrapper[4789]: E1216 08:25:19.386667 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerName="pull" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.386676 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerName="pull" Dec 16 08:25:19 crc kubenswrapper[4789]: E1216 08:25:19.386692 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerName="extract" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.386722 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerName="extract" Dec 16 08:25:19 crc kubenswrapper[4789]: E1216 08:25:19.386774 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" 
containerName="horizon-log" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.386807 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon-log" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.387866 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e0476c-3b9a-4129-8376-55b976dbcadc" containerName="extract" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.387905 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.387952 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e431d2d0-98ab-42f4-bb32-3fe3aca72c6b" containerName="horizon-log" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.389460 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.392249 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.392565 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-4x77d" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.392756 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.402481 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.504121 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-272ld\" (UniqueName: 
\"kubernetes.io/projected/83b9a811-bc86-44be-a5e3-bac352d1f377-kube-api-access-272ld\") pod \"obo-prometheus-operator-668cf9dfbb-klwsc\" (UID: \"83b9a811-bc86-44be-a5e3-bac352d1f377\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.579246 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.580974 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.589770 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-42gcs" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.590044 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.597231 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.598780 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.606185 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-272ld\" (UniqueName: \"kubernetes.io/projected/83b9a811-bc86-44be-a5e3-bac352d1f377-kube-api-access-272ld\") pod \"obo-prometheus-operator-668cf9dfbb-klwsc\" (UID: \"83b9a811-bc86-44be-a5e3-bac352d1f377\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.606768 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.614427 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.709203 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82adb0c3-8b98-4764-b8ca-11eb3c373f16-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-zjsms\" (UID: \"82adb0c3-8b98-4764-b8ca-11eb3c373f16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.709264 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b84c157e-f7a3-4b07-acdb-0f833aa4bdc3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-r28lm\" (UID: \"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.709347 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82adb0c3-8b98-4764-b8ca-11eb3c373f16-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-zjsms\" (UID: \"82adb0c3-8b98-4764-b8ca-11eb3c373f16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.709584 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b84c157e-f7a3-4b07-acdb-0f833aa4bdc3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-r28lm\" (UID: \"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.712882 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-272ld\" (UniqueName: \"kubernetes.io/projected/83b9a811-bc86-44be-a5e3-bac352d1f377-kube-api-access-272ld\") pod \"obo-prometheus-operator-668cf9dfbb-klwsc\" (UID: \"83b9a811-bc86-44be-a5e3-bac352d1f377\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.724823 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.813069 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82adb0c3-8b98-4764-b8ca-11eb3c373f16-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-zjsms\" (UID: \"82adb0c3-8b98-4764-b8ca-11eb3c373f16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.813131 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b84c157e-f7a3-4b07-acdb-0f833aa4bdc3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-r28lm\" (UID: \"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.813237 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82adb0c3-8b98-4764-b8ca-11eb3c373f16-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-zjsms\" (UID: \"82adb0c3-8b98-4764-b8ca-11eb3c373f16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.813300 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b84c157e-f7a3-4b07-acdb-0f833aa4bdc3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-r28lm\" (UID: \"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.817554 4789 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-hktzx"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.823956 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b84c157e-f7a3-4b07-acdb-0f833aa4bdc3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-r28lm\" (UID: \"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.824260 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82adb0c3-8b98-4764-b8ca-11eb3c373f16-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-zjsms\" (UID: \"82adb0c3-8b98-4764-b8ca-11eb3c373f16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.833439 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.848473 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-hktzx"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.850577 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b84c157e-f7a3-4b07-acdb-0f833aa4bdc3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-r28lm\" (UID: \"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.851050 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.851283 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-52np4" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.865565 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82adb0c3-8b98-4764-b8ca-11eb3c373f16-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c85d754c7-zjsms\" (UID: \"82adb0c3-8b98-4764-b8ca-11eb3c373f16\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.909662 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.922260 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.984322 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nmtvf"] Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.985499 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:19 crc kubenswrapper[4789]: I1216 08:25:19.988759 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-v8dlt" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.021292 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnnrm\" (UniqueName: \"kubernetes.io/projected/cab75a30-cb53-4af1-9236-b475e66dcaec-kube-api-access-qnnrm\") pod \"observability-operator-d8bb48f5d-hktzx\" (UID: \"cab75a30-cb53-4af1-9236-b475e66dcaec\") " pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.021636 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cab75a30-cb53-4af1-9236-b475e66dcaec-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-hktzx\" (UID: \"cab75a30-cb53-4af1-9236-b475e66dcaec\") " pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.041960 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nmtvf"] Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.123544 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mb8t\" (UniqueName: 
\"kubernetes.io/projected/df255ee7-fed8-4845-a2af-49497297cfd4-kube-api-access-8mb8t\") pod \"perses-operator-5446b9c989-nmtvf\" (UID: \"df255ee7-fed8-4845-a2af-49497297cfd4\") " pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.123625 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/df255ee7-fed8-4845-a2af-49497297cfd4-openshift-service-ca\") pod \"perses-operator-5446b9c989-nmtvf\" (UID: \"df255ee7-fed8-4845-a2af-49497297cfd4\") " pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.123670 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnnrm\" (UniqueName: \"kubernetes.io/projected/cab75a30-cb53-4af1-9236-b475e66dcaec-kube-api-access-qnnrm\") pod \"observability-operator-d8bb48f5d-hktzx\" (UID: \"cab75a30-cb53-4af1-9236-b475e66dcaec\") " pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.123733 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cab75a30-cb53-4af1-9236-b475e66dcaec-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-hktzx\" (UID: \"cab75a30-cb53-4af1-9236-b475e66dcaec\") " pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.130287 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cab75a30-cb53-4af1-9236-b475e66dcaec-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-hktzx\" (UID: \"cab75a30-cb53-4af1-9236-b475e66dcaec\") " pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:20 
crc kubenswrapper[4789]: I1216 08:25:20.144705 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnnrm\" (UniqueName: \"kubernetes.io/projected/cab75a30-cb53-4af1-9236-b475e66dcaec-kube-api-access-qnnrm\") pod \"observability-operator-d8bb48f5d-hktzx\" (UID: \"cab75a30-cb53-4af1-9236-b475e66dcaec\") " pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.225146 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mb8t\" (UniqueName: \"kubernetes.io/projected/df255ee7-fed8-4845-a2af-49497297cfd4-kube-api-access-8mb8t\") pod \"perses-operator-5446b9c989-nmtvf\" (UID: \"df255ee7-fed8-4845-a2af-49497297cfd4\") " pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.225580 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/df255ee7-fed8-4845-a2af-49497297cfd4-openshift-service-ca\") pod \"perses-operator-5446b9c989-nmtvf\" (UID: \"df255ee7-fed8-4845-a2af-49497297cfd4\") " pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.226498 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/df255ee7-fed8-4845-a2af-49497297cfd4-openshift-service-ca\") pod \"perses-operator-5446b9c989-nmtvf\" (UID: \"df255ee7-fed8-4845-a2af-49497297cfd4\") " pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.247222 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mb8t\" (UniqueName: \"kubernetes.io/projected/df255ee7-fed8-4845-a2af-49497297cfd4-kube-api-access-8mb8t\") pod \"perses-operator-5446b9c989-nmtvf\" (UID: 
\"df255ee7-fed8-4845-a2af-49497297cfd4\") " pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.266997 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.388331 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:20 crc kubenswrapper[4789]: W1216 08:25:20.493354 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83b9a811_bc86_44be_a5e3_bac352d1f377.slice/crio-c99ad266b5a0fabe2b3712df8633e02dff31f27c614fe43f48697b1105c0b33b WatchSource:0}: Error finding container c99ad266b5a0fabe2b3712df8633e02dff31f27c614fe43f48697b1105c0b33b: Status 404 returned error can't find the container with id c99ad266b5a0fabe2b3712df8633e02dff31f27c614fe43f48697b1105c0b33b Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.495025 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc"] Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.644900 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm"] Dec 16 08:25:20 crc kubenswrapper[4789]: W1216 08:25:20.655575 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84c157e_f7a3_4b07_acdb_0f833aa4bdc3.slice/crio-045a998031866ab155bdf44ccf0c6cfd7588b0de7c73ae09b340d12f4966dc90 WatchSource:0}: Error finding container 045a998031866ab155bdf44ccf0c6cfd7588b0de7c73ae09b340d12f4966dc90: Status 404 returned error can't find the container with id 045a998031866ab155bdf44ccf0c6cfd7588b0de7c73ae09b340d12f4966dc90 Dec 16 08:25:20 crc 
kubenswrapper[4789]: I1216 08:25:20.676424 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms"] Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.854508 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-hktzx"] Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.885218 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" event={"ID":"83b9a811-bc86-44be-a5e3-bac352d1f377","Type":"ContainerStarted","Data":"c99ad266b5a0fabe2b3712df8633e02dff31f27c614fe43f48697b1105c0b33b"} Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.894184 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" event={"ID":"cab75a30-cb53-4af1-9236-b475e66dcaec","Type":"ContainerStarted","Data":"3e83f6d1723e09dbcf0ef1c309f79a5eb70f7cc2e44b5b9df6c52dd1321e5c90"} Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.897417 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" event={"ID":"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3","Type":"ContainerStarted","Data":"045a998031866ab155bdf44ccf0c6cfd7588b0de7c73ae09b340d12f4966dc90"} Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.898650 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" event={"ID":"82adb0c3-8b98-4764-b8ca-11eb3c373f16","Type":"ContainerStarted","Data":"0ddcf496a5ee3ce311216907348e3562e63a7d6c75a135addc6dcb180d300800"} Dec 16 08:25:20 crc kubenswrapper[4789]: I1216 08:25:20.959439 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nmtvf"] Dec 16 08:25:20 crc kubenswrapper[4789]: W1216 08:25:20.959880 4789 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf255ee7_fed8_4845_a2af_49497297cfd4.slice/crio-e0a293433585402a242eeef046dc5d0488d79ad967ef6a449aff1c61e7d0c999 WatchSource:0}: Error finding container e0a293433585402a242eeef046dc5d0488d79ad967ef6a449aff1c61e7d0c999: Status 404 returned error can't find the container with id e0a293433585402a242eeef046dc5d0488d79ad967ef6a449aff1c61e7d0c999 Dec 16 08:25:21 crc kubenswrapper[4789]: I1216 08:25:21.925969 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nmtvf" event={"ID":"df255ee7-fed8-4845-a2af-49497297cfd4","Type":"ContainerStarted","Data":"e0a293433585402a242eeef046dc5d0488d79ad967ef6a449aff1c61e7d0c999"} Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.045697 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" event={"ID":"82adb0c3-8b98-4764-b8ca-11eb3c373f16","Type":"ContainerStarted","Data":"174601cdf99120e351ef3ecee03ea5dc90da04b142a365279b28a4c3388a9767"} Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.048800 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nmtvf" event={"ID":"df255ee7-fed8-4845-a2af-49497297cfd4","Type":"ContainerStarted","Data":"0d606e6068d1d05a85f29740f0e81c0b358d804aaa2e662194663d37f9734345"} Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.048953 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.052239 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" 
event={"ID":"83b9a811-bc86-44be-a5e3-bac352d1f377","Type":"ContainerStarted","Data":"531b436c9ca087a645ca59d86f71b1f914c592a568418730e28fadb44d9fd8ed"} Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.054297 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" event={"ID":"cab75a30-cb53-4af1-9236-b475e66dcaec","Type":"ContainerStarted","Data":"9602f90a8d4ed77b6f2b88c9100f69614739a259d2c10691ea9ed2c259c9735d"} Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.054691 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.056175 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" event={"ID":"b84c157e-f7a3-4b07-acdb-0f833aa4bdc3","Type":"ContainerStarted","Data":"e8db75dc7fdf324ee65c6f9eeacda9044f133eb55cde78c7a819613644469363"} Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.064874 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.071049 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-zjsms" podStartSLOduration=2.672112127 podStartE2EDuration="12.071034428s" podCreationTimestamp="2025-12-16 08:25:19 +0000 UTC" firstStartedPulling="2025-12-16 08:25:20.677612146 +0000 UTC m=+5658.939499775" lastFinishedPulling="2025-12-16 08:25:30.076534457 +0000 UTC m=+5668.338422076" observedRunningTime="2025-12-16 08:25:31.067335317 +0000 UTC m=+5669.329222946" watchObservedRunningTime="2025-12-16 08:25:31.071034428 +0000 UTC m=+5669.332922047" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.093806 4789 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-klwsc" podStartSLOduration=2.510448895 podStartE2EDuration="12.093790194s" podCreationTimestamp="2025-12-16 08:25:19 +0000 UTC" firstStartedPulling="2025-12-16 08:25:20.49777434 +0000 UTC m=+5658.759661969" lastFinishedPulling="2025-12-16 08:25:30.081115639 +0000 UTC m=+5668.343003268" observedRunningTime="2025-12-16 08:25:31.090461393 +0000 UTC m=+5669.352349032" watchObservedRunningTime="2025-12-16 08:25:31.093790194 +0000 UTC m=+5669.355677823" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.153641 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-nmtvf" podStartSLOduration=2.9811508509999998 podStartE2EDuration="12.153620716s" podCreationTimestamp="2025-12-16 08:25:19 +0000 UTC" firstStartedPulling="2025-12-16 08:25:20.962166162 +0000 UTC m=+5659.224053791" lastFinishedPulling="2025-12-16 08:25:30.134636027 +0000 UTC m=+5668.396523656" observedRunningTime="2025-12-16 08:25:31.141620373 +0000 UTC m=+5669.403508002" watchObservedRunningTime="2025-12-16 08:25:31.153620716 +0000 UTC m=+5669.415508345" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.214265 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c85d754c7-r28lm" podStartSLOduration=2.814945009 podStartE2EDuration="12.214240008s" podCreationTimestamp="2025-12-16 08:25:19 +0000 UTC" firstStartedPulling="2025-12-16 08:25:20.678134189 +0000 UTC m=+5658.940021818" lastFinishedPulling="2025-12-16 08:25:30.077429188 +0000 UTC m=+5668.339316817" observedRunningTime="2025-12-16 08:25:31.175639415 +0000 UTC m=+5669.437527044" watchObservedRunningTime="2025-12-16 08:25:31.214240008 +0000 UTC m=+5669.476127637" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.224271 4789 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-hktzx" podStartSLOduration=2.926539387 podStartE2EDuration="12.224250233s" podCreationTimestamp="2025-12-16 08:25:19 +0000 UTC" firstStartedPulling="2025-12-16 08:25:20.86593986 +0000 UTC m=+5659.127827489" lastFinishedPulling="2025-12-16 08:25:30.163650706 +0000 UTC m=+5668.425538335" observedRunningTime="2025-12-16 08:25:31.207306498 +0000 UTC m=+5669.469194127" watchObservedRunningTime="2025-12-16 08:25:31.224250233 +0000 UTC m=+5669.486137862" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.495444 4789 scope.go:117] "RemoveContainer" containerID="c40445bbd449bb5bf768a739e15d6f32b2ba40352aca9ad4001d710bb16d86b8" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.529334 4789 scope.go:117] "RemoveContainer" containerID="3d9e045cc89528945529db22f7098c0ff197a08d8af5ac7f5939c90ad9c9cb81" Dec 16 08:25:31 crc kubenswrapper[4789]: I1216 08:25:31.561351 4789 scope.go:117] "RemoveContainer" containerID="3a8413362262125d355800eb63ed82a1221e13ea56e7e186242c98278b80f337" Dec 16 08:25:37 crc kubenswrapper[4789]: I1216 08:25:37.041509 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-837b-account-create-update-tb5cw"] Dec 16 08:25:37 crc kubenswrapper[4789]: I1216 08:25:37.058966 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lndx6"] Dec 16 08:25:37 crc kubenswrapper[4789]: I1216 08:25:37.061589 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-837b-account-create-update-tb5cw"] Dec 16 08:25:37 crc kubenswrapper[4789]: I1216 08:25:37.069933 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lndx6"] Dec 16 08:25:38 crc kubenswrapper[4789]: I1216 08:25:38.116448 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="606167d1-6cc1-4177-ae73-8369b4bddcf4" path="/var/lib/kubelet/pods/606167d1-6cc1-4177-ae73-8369b4bddcf4/volumes" Dec 
16 08:25:38 crc kubenswrapper[4789]: I1216 08:25:38.117327 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc820800-6a99-446e-b203-065a764d80a6" path="/var/lib/kubelet/pods/dc820800-6a99-446e-b203-065a764d80a6/volumes" Dec 16 08:25:40 crc kubenswrapper[4789]: I1216 08:25:40.391836 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-nmtvf" Dec 16 08:25:48 crc kubenswrapper[4789]: I1216 08:25:48.914035 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 08:25:48 crc kubenswrapper[4789]: I1216 08:25:48.914905 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="58061711-79cd-4f55-946d-808fc5787077" containerName="openstackclient" containerID="cri-o://41ecbc1741748ef28f56fd4730b31723aa751a4ab2e121ef8653900d8ed3b220" gracePeriod=2 Dec 16 08:25:48 crc kubenswrapper[4789]: I1216 08:25:48.922581 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 16 08:25:48 crc kubenswrapper[4789]: I1216 08:25:48.988035 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 08:25:48 crc kubenswrapper[4789]: E1216 08:25:48.988435 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58061711-79cd-4f55-946d-808fc5787077" containerName="openstackclient" Dec 16 08:25:48 crc kubenswrapper[4789]: I1216 08:25:48.988451 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="58061711-79cd-4f55-946d-808fc5787077" containerName="openstackclient" Dec 16 08:25:48 crc kubenswrapper[4789]: I1216 08:25:48.988630 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="58061711-79cd-4f55-946d-808fc5787077" containerName="openstackclient" Dec 16 08:25:48 crc kubenswrapper[4789]: I1216 08:25:48.989291 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.014166 4789 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="58061711-79cd-4f55-946d-808fc5787077" podUID="4731a9c9-432c-4a87-8809-062a075bae7d" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.017150 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.105169 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.106791 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.108993 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jbcjr" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.117859 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.131572 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4731a9c9-432c-4a87-8809-062a075bae7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.131622 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4731a9c9-432c-4a87-8809-062a075bae7d-openstack-config\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.132421 
4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bxl2\" (UniqueName: \"kubernetes.io/projected/4731a9c9-432c-4a87-8809-062a075bae7d-kube-api-access-2bxl2\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.234283 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4731a9c9-432c-4a87-8809-062a075bae7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.234342 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4731a9c9-432c-4a87-8809-062a075bae7d-openstack-config\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.234380 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v44g\" (UniqueName: \"kubernetes.io/projected/bf90e140-074b-4515-bf9f-827a89acbce4-kube-api-access-2v44g\") pod \"kube-state-metrics-0\" (UID: \"bf90e140-074b-4515-bf9f-827a89acbce4\") " pod="openstack/kube-state-metrics-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.234432 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bxl2\" (UniqueName: \"kubernetes.io/projected/4731a9c9-432c-4a87-8809-062a075bae7d-kube-api-access-2bxl2\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.235471 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4731a9c9-432c-4a87-8809-062a075bae7d-openstack-config\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.245006 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4731a9c9-432c-4a87-8809-062a075bae7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.269033 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bxl2\" (UniqueName: \"kubernetes.io/projected/4731a9c9-432c-4a87-8809-062a075bae7d-kube-api-access-2bxl2\") pod \"openstackclient\" (UID: \"4731a9c9-432c-4a87-8809-062a075bae7d\") " pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.318045 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.339626 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v44g\" (UniqueName: \"kubernetes.io/projected/bf90e140-074b-4515-bf9f-827a89acbce4-kube-api-access-2v44g\") pod \"kube-state-metrics-0\" (UID: \"bf90e140-074b-4515-bf9f-827a89acbce4\") " pod="openstack/kube-state-metrics-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.361143 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v44g\" (UniqueName: \"kubernetes.io/projected/bf90e140-074b-4515-bf9f-827a89acbce4-kube-api-access-2v44g\") pod \"kube-state-metrics-0\" (UID: \"bf90e140-074b-4515-bf9f-827a89acbce4\") " pod="openstack/kube-state-metrics-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.422566 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.825948 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.829664 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.832079 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.837445 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.837612 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-7thh9" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.839545 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.840488 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.849214 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.858313 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/519395da-f8ab-483d-ae5f-adb8e234939d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.858374 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc 
kubenswrapper[4789]: I1216 08:25:49.858431 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qr5\" (UniqueName: \"kubernetes.io/projected/519395da-f8ab-483d-ae5f-adb8e234939d-kube-api-access-d8qr5\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.858464 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.858487 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/519395da-f8ab-483d-ae5f-adb8e234939d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.858517 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.858648 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/519395da-f8ab-483d-ae5f-adb8e234939d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: 
\"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.960701 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.960804 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qr5\" (UniqueName: \"kubernetes.io/projected/519395da-f8ab-483d-ae5f-adb8e234939d-kube-api-access-d8qr5\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.960837 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.960856 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/519395da-f8ab-483d-ae5f-adb8e234939d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.960881 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.960934 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/519395da-f8ab-483d-ae5f-adb8e234939d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.961046 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/519395da-f8ab-483d-ae5f-adb8e234939d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.963732 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/519395da-f8ab-483d-ae5f-adb8e234939d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.976341 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/519395da-f8ab-483d-ae5f-adb8e234939d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.976346 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/519395da-f8ab-483d-ae5f-adb8e234939d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.976497 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.979690 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:49 crc kubenswrapper[4789]: I1216 08:25:49.983486 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/519395da-f8ab-483d-ae5f-adb8e234939d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.001549 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qr5\" (UniqueName: \"kubernetes.io/projected/519395da-f8ab-483d-ae5f-adb8e234939d-kube-api-access-d8qr5\") pod \"alertmanager-metric-storage-0\" (UID: \"519395da-f8ab-483d-ae5f-adb8e234939d\") " pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.007302 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.178344 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.192600 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.237341 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4731a9c9-432c-4a87-8809-062a075bae7d","Type":"ContainerStarted","Data":"7335e1aa4370c0e4831e242255d7419d463f03987d5a5415d6049a42e24cf533"} Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.239553 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf90e140-074b-4515-bf9f-827a89acbce4","Type":"ContainerStarted","Data":"be3f7f856f8300cb420457bd5d5d210c10ef821412ee2cd3e56ada4414d087c6"} Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.452079 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.456635 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.458961 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-q4ml4" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.459159 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.459357 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.459588 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.460443 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.460760 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.468201 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475344 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a938c46-1e0f-4af2-873c-de4472f39a2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475450 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475484 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a938c46-1e0f-4af2-873c-de4472f39a2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475543 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a938c46-1e0f-4af2-873c-de4472f39a2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475601 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvlr\" (UniqueName: \"kubernetes.io/projected/1a938c46-1e0f-4af2-873c-de4472f39a2b-kube-api-access-nmvlr\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475772 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.475821 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.577419 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a938c46-1e0f-4af2-873c-de4472f39a2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.577752 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a938c46-1e0f-4af2-873c-de4472f39a2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.577793 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.577823 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nmvlr\" (UniqueName: \"kubernetes.io/projected/1a938c46-1e0f-4af2-873c-de4472f39a2b-kube-api-access-nmvlr\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.577902 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.577949 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.578035 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a938c46-1e0f-4af2-873c-de4472f39a2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.578091 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.584772 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.586897 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a938c46-1e0f-4af2-873c-de4472f39a2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.587155 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a938c46-1e0f-4af2-873c-de4472f39a2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.589452 4789 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.589482 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/80ae856d34d66cab4813c4fee9b6cfad872914ffa16585284a18ea76b3f4a010/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.590066 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.591689 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a938c46-1e0f-4af2-873c-de4472f39a2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.592042 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a938c46-1e0f-4af2-873c-de4472f39a2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.600468 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvlr\" (UniqueName: 
\"kubernetes.io/projected/1a938c46-1e0f-4af2-873c-de4472f39a2b-kube-api-access-nmvlr\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.633424 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e2ffe9e-67fd-4ab0-aee0-d26bd8cd4b5b\") pod \"prometheus-metric-storage-0\" (UID: \"1a938c46-1e0f-4af2-873c-de4472f39a2b\") " pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.764813 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 16 08:25:50 crc kubenswrapper[4789]: W1216 08:25:50.769938 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519395da_f8ab_483d_ae5f_adb8e234939d.slice/crio-f6d8363151306117e5a73051194ac96ac3fd8f1bcf2faa978919523d5df017a1 WatchSource:0}: Error finding container f6d8363151306117e5a73051194ac96ac3fd8f1bcf2faa978919523d5df017a1: Status 404 returned error can't find the container with id f6d8363151306117e5a73051194ac96ac3fd8f1bcf2faa978919523d5df017a1 Dec 16 08:25:50 crc kubenswrapper[4789]: I1216 08:25:50.774212 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.266289 4789 generic.go:334] "Generic (PLEG): container finished" podID="58061711-79cd-4f55-946d-808fc5787077" containerID="41ecbc1741748ef28f56fd4730b31723aa751a4ab2e121ef8653900d8ed3b220" exitCode=137 Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.274900 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf90e140-074b-4515-bf9f-827a89acbce4","Type":"ContainerStarted","Data":"a6162e0ce72e99a38f9a4b8ec7eaface04652e8e95ca23afee617a3856e9f7bd"} Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.275082 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.279940 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"519395da-f8ab-483d-ae5f-adb8e234939d","Type":"ContainerStarted","Data":"f6d8363151306117e5a73051194ac96ac3fd8f1bcf2faa978919523d5df017a1"} Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.286965 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4731a9c9-432c-4a87-8809-062a075bae7d","Type":"ContainerStarted","Data":"e31754ac0848d2789c3b50afff955862fbbd83dcaffa1f2c477850025aa5fd81"} Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.318086 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.927551287 podStartE2EDuration="2.318066534s" podCreationTimestamp="2025-12-16 08:25:49 +0000 UTC" firstStartedPulling="2025-12-16 08:25:50.021621252 +0000 UTC m=+5688.283508881" lastFinishedPulling="2025-12-16 08:25:50.412136499 +0000 UTC m=+5688.674024128" observedRunningTime="2025-12-16 08:25:51.303689913 +0000 UTC m=+5689.565577542" watchObservedRunningTime="2025-12-16 
08:25:51.318066534 +0000 UTC m=+5689.579954163" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.326562 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.326540501 podStartE2EDuration="3.326540501s" podCreationTimestamp="2025-12-16 08:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:25:51.322696057 +0000 UTC m=+5689.584583686" watchObservedRunningTime="2025-12-16 08:25:51.326540501 +0000 UTC m=+5689.588428130" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.412757 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 16 08:25:51 crc kubenswrapper[4789]: W1216 08:25:51.422384 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a938c46_1e0f_4af2_873c_de4472f39a2b.slice/crio-3b02557fbf48b8c595be17806099ad067a6dc20ff7b5b8aa51acc1802b79d083 WatchSource:0}: Error finding container 3b02557fbf48b8c595be17806099ad067a6dc20ff7b5b8aa51acc1802b79d083: Status 404 returned error can't find the container with id 3b02557fbf48b8c595be17806099ad067a6dc20ff7b5b8aa51acc1802b79d083 Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.476678 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.524685 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn7qn\" (UniqueName: \"kubernetes.io/projected/58061711-79cd-4f55-946d-808fc5787077-kube-api-access-gn7qn\") pod \"58061711-79cd-4f55-946d-808fc5787077\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.525674 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58061711-79cd-4f55-946d-808fc5787077-openstack-config-secret\") pod \"58061711-79cd-4f55-946d-808fc5787077\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.525927 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58061711-79cd-4f55-946d-808fc5787077-openstack-config\") pod \"58061711-79cd-4f55-946d-808fc5787077\" (UID: \"58061711-79cd-4f55-946d-808fc5787077\") " Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.530615 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58061711-79cd-4f55-946d-808fc5787077-kube-api-access-gn7qn" (OuterVolumeSpecName: "kube-api-access-gn7qn") pod "58061711-79cd-4f55-946d-808fc5787077" (UID: "58061711-79cd-4f55-946d-808fc5787077"). InnerVolumeSpecName "kube-api-access-gn7qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.562813 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58061711-79cd-4f55-946d-808fc5787077-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "58061711-79cd-4f55-946d-808fc5787077" (UID: "58061711-79cd-4f55-946d-808fc5787077"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.609564 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58061711-79cd-4f55-946d-808fc5787077-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "58061711-79cd-4f55-946d-808fc5787077" (UID: "58061711-79cd-4f55-946d-808fc5787077"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.629782 4789 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/58061711-79cd-4f55-946d-808fc5787077-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.630117 4789 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/58061711-79cd-4f55-946d-808fc5787077-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:51 crc kubenswrapper[4789]: I1216 08:25:51.630132 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn7qn\" (UniqueName: \"kubernetes.io/projected/58061711-79cd-4f55-946d-808fc5787077-kube-api-access-gn7qn\") on node \"crc\" DevicePath \"\"" Dec 16 08:25:52 crc kubenswrapper[4789]: I1216 08:25:52.116249 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58061711-79cd-4f55-946d-808fc5787077" path="/var/lib/kubelet/pods/58061711-79cd-4f55-946d-808fc5787077/volumes" Dec 16 08:25:52 crc kubenswrapper[4789]: I1216 08:25:52.294129 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a938c46-1e0f-4af2-873c-de4472f39a2b","Type":"ContainerStarted","Data":"3b02557fbf48b8c595be17806099ad067a6dc20ff7b5b8aa51acc1802b79d083"} Dec 16 08:25:52 crc 
kubenswrapper[4789]: I1216 08:25:52.296383 4789 scope.go:117] "RemoveContainer" containerID="41ecbc1741748ef28f56fd4730b31723aa751a4ab2e121ef8653900d8ed3b220" Dec 16 08:25:52 crc kubenswrapper[4789]: I1216 08:25:52.296475 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 08:25:57 crc kubenswrapper[4789]: I1216 08:25:57.338417 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"519395da-f8ab-483d-ae5f-adb8e234939d","Type":"ContainerStarted","Data":"83ea22f5e8b5c3828106585c7ad8aa1b2d9e8d3dbe4142c61ab4c38f105bf125"} Dec 16 08:25:57 crc kubenswrapper[4789]: I1216 08:25:57.340660 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a938c46-1e0f-4af2-873c-de4472f39a2b","Type":"ContainerStarted","Data":"ce113539edda2d6e558802e705c4d06c9cb472895ee67e62e8d95c51b29b6f74"} Dec 16 08:25:59 crc kubenswrapper[4789]: I1216 08:25:59.426502 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 08:26:03 crc kubenswrapper[4789]: I1216 08:26:03.392522 4789 generic.go:334] "Generic (PLEG): container finished" podID="519395da-f8ab-483d-ae5f-adb8e234939d" containerID="83ea22f5e8b5c3828106585c7ad8aa1b2d9e8d3dbe4142c61ab4c38f105bf125" exitCode=0 Dec 16 08:26:03 crc kubenswrapper[4789]: I1216 08:26:03.392609 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"519395da-f8ab-483d-ae5f-adb8e234939d","Type":"ContainerDied","Data":"83ea22f5e8b5c3828106585c7ad8aa1b2d9e8d3dbe4142c61ab4c38f105bf125"} Dec 16 08:26:03 crc kubenswrapper[4789]: I1216 08:26:03.395094 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:26:03 crc kubenswrapper[4789]: I1216 08:26:03.399616 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="1a938c46-1e0f-4af2-873c-de4472f39a2b" containerID="ce113539edda2d6e558802e705c4d06c9cb472895ee67e62e8d95c51b29b6f74" exitCode=0 Dec 16 08:26:03 crc kubenswrapper[4789]: I1216 08:26:03.399660 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a938c46-1e0f-4af2-873c-de4472f39a2b","Type":"ContainerDied","Data":"ce113539edda2d6e558802e705c4d06c9cb472895ee67e62e8d95c51b29b6f74"} Dec 16 08:26:04 crc kubenswrapper[4789]: I1216 08:26:04.045874 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ffxcd"] Dec 16 08:26:04 crc kubenswrapper[4789]: I1216 08:26:04.056762 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ffxcd"] Dec 16 08:26:04 crc kubenswrapper[4789]: I1216 08:26:04.120247 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6387d858-ac8d-4d8d-b910-f12598ffc6bb" path="/var/lib/kubelet/pods/6387d858-ac8d-4d8d-b910-f12598ffc6bb/volumes" Dec 16 08:26:07 crc kubenswrapper[4789]: I1216 08:26:07.453788 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"519395da-f8ab-483d-ae5f-adb8e234939d","Type":"ContainerStarted","Data":"497667af59ba6021cb06e932f6b6635a9d997db8b5353ce5df51327f3f455613"} Dec 16 08:26:12 crc kubenswrapper[4789]: I1216 08:26:12.502040 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"519395da-f8ab-483d-ae5f-adb8e234939d","Type":"ContainerStarted","Data":"094466ce796dc50db0c3a4f5ac344c9bcb22a19dba6d4da649e317c9a56da647"} Dec 16 08:26:12 crc kubenswrapper[4789]: I1216 08:26:12.503461 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 16 08:26:12 crc kubenswrapper[4789]: I1216 08:26:12.505658 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/alertmanager-metric-storage-0" Dec 16 08:26:12 crc kubenswrapper[4789]: I1216 08:26:12.526471 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.002861601 podStartE2EDuration="23.526448842s" podCreationTimestamp="2025-12-16 08:25:49 +0000 UTC" firstStartedPulling="2025-12-16 08:25:50.773243416 +0000 UTC m=+5689.035131045" lastFinishedPulling="2025-12-16 08:26:06.296830657 +0000 UTC m=+5704.558718286" observedRunningTime="2025-12-16 08:26:12.521655266 +0000 UTC m=+5710.783542895" watchObservedRunningTime="2025-12-16 08:26:12.526448842 +0000 UTC m=+5710.788336471" Dec 16 08:26:13 crc kubenswrapper[4789]: I1216 08:26:13.519639 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a938c46-1e0f-4af2-873c-de4472f39a2b","Type":"ContainerStarted","Data":"be11ebc587f475d9d6a71b7792292e3d7d0dea2905f86ba1c4e1ffd30df9391a"} Dec 16 08:26:16 crc kubenswrapper[4789]: I1216 08:26:16.548948 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a938c46-1e0f-4af2-873c-de4472f39a2b","Type":"ContainerStarted","Data":"91ef0b3cb8059fd5f07822c272c25c5b04aaf451f697cb02587424130e4edab0"} Dec 16 08:26:19 crc kubenswrapper[4789]: I1216 08:26:19.581794 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1a938c46-1e0f-4af2-873c-de4472f39a2b","Type":"ContainerStarted","Data":"fbb69a547b64cd9d04d37cefb617b1b1335f9d3b672437e9fa4eca8ec5de8531"} Dec 16 08:26:19 crc kubenswrapper[4789]: I1216 08:26:19.605854 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=2.921330801 podStartE2EDuration="30.605837311s" podCreationTimestamp="2025-12-16 08:25:49 +0000 UTC" firstStartedPulling="2025-12-16 08:25:51.424755132 +0000 UTC m=+5689.686642761" 
lastFinishedPulling="2025-12-16 08:26:19.109261642 +0000 UTC m=+5717.371149271" observedRunningTime="2025-12-16 08:26:19.603648538 +0000 UTC m=+5717.865536187" watchObservedRunningTime="2025-12-16 08:26:19.605837311 +0000 UTC m=+5717.867724940" Dec 16 08:26:20 crc kubenswrapper[4789]: I1216 08:26:20.775434 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 16 08:26:20 crc kubenswrapper[4789]: I1216 08:26:20.775769 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 16 08:26:20 crc kubenswrapper[4789]: I1216 08:26:20.777504 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 16 08:26:21 crc kubenswrapper[4789]: I1216 08:26:21.600420 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.618607 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.621689 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.623728 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.624481 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.638929 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.781516 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.781563 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.781599 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-log-httpd\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.781620 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762z5\" (UniqueName: \"kubernetes.io/projected/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-kube-api-access-762z5\") pod \"ceilometer-0\" (UID: 
\"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.781667 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-run-httpd\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.781685 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-scripts\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.781753 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-config-data\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.884472 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.884531 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.884574 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-log-httpd\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.884598 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-762z5\" (UniqueName: \"kubernetes.io/projected/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-kube-api-access-762z5\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.884657 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-run-httpd\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.884686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-scripts\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.885105 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-log-httpd\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.885136 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-run-httpd\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 
08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.887333 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-config-data\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.897821 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-scripts\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.902767 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.903185 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-config-data\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.903711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-762z5\" (UniqueName: \"kubernetes.io/projected/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-kube-api-access-762z5\") pod \"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.903693 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " pod="openstack/ceilometer-0" Dec 16 08:26:23 crc kubenswrapper[4789]: I1216 08:26:23.945111 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:26:24 crc kubenswrapper[4789]: W1216 08:26:24.431808 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb19174f6_911d_4d14_a9b1_c7e6c555f4ac.slice/crio-7086d2765987ff1a73a273d38a0987adcfb085174d0a86cd40b3571bf6d445e4 WatchSource:0}: Error finding container 7086d2765987ff1a73a273d38a0987adcfb085174d0a86cd40b3571bf6d445e4: Status 404 returned error can't find the container with id 7086d2765987ff1a73a273d38a0987adcfb085174d0a86cd40b3571bf6d445e4 Dec 16 08:26:24 crc kubenswrapper[4789]: I1216 08:26:24.437494 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:26:24 crc kubenswrapper[4789]: I1216 08:26:24.640204 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerStarted","Data":"7086d2765987ff1a73a273d38a0987adcfb085174d0a86cd40b3571bf6d445e4"} Dec 16 08:26:29 crc kubenswrapper[4789]: I1216 08:26:29.688768 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerStarted","Data":"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2"} Dec 16 08:26:31 crc kubenswrapper[4789]: I1216 08:26:31.699619 4789 scope.go:117] "RemoveContainer" containerID="2357d778a830c0b83497163681744e0d8eb8154927b1afdeee85558169b24107" Dec 16 08:26:31 crc kubenswrapper[4789]: I1216 08:26:31.709961 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerStarted","Data":"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005"} Dec 16 08:26:31 crc kubenswrapper[4789]: I1216 08:26:31.728478 4789 scope.go:117] "RemoveContainer" containerID="90919f111ce167d64c47fd72e53d47d04e018b50823f690529d052457320815e" Dec 16 08:26:31 crc kubenswrapper[4789]: I1216 08:26:31.796375 4789 scope.go:117] "RemoveContainer" containerID="f3c6fe2b94bac5470c865964e9caed105820fecaf565aa48d53ed7b52ec234fb" Dec 16 08:26:32 crc kubenswrapper[4789]: I1216 08:26:32.720671 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerStarted","Data":"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb"} Dec 16 08:26:33 crc kubenswrapper[4789]: I1216 08:26:33.745164 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerStarted","Data":"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805"} Dec 16 08:26:33 crc kubenswrapper[4789]: I1216 08:26:33.745750 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 08:26:33 crc kubenswrapper[4789]: I1216 08:26:33.778680 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.221463347 podStartE2EDuration="10.778664341s" podCreationTimestamp="2025-12-16 08:26:23 +0000 UTC" firstStartedPulling="2025-12-16 08:26:24.434743145 +0000 UTC m=+5722.696630774" lastFinishedPulling="2025-12-16 08:26:32.991944139 +0000 UTC m=+5731.253831768" observedRunningTime="2025-12-16 08:26:33.765403617 +0000 UTC m=+5732.027291246" watchObservedRunningTime="2025-12-16 08:26:33.778664341 +0000 UTC m=+5732.040551970" Dec 16 08:26:34 crc kubenswrapper[4789]: I1216 08:26:34.037086 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-355b-account-create-update-fc9kb"] Dec 16 08:26:34 crc kubenswrapper[4789]: I1216 08:26:34.047325 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zpkbs"] Dec 16 08:26:34 crc kubenswrapper[4789]: I1216 08:26:34.056623 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-355b-account-create-update-fc9kb"] Dec 16 08:26:34 crc kubenswrapper[4789]: I1216 08:26:34.066099 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zpkbs"] Dec 16 08:26:34 crc kubenswrapper[4789]: I1216 08:26:34.117171 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1e854f-a5fd-4cf1-9f74-dcef650d90b4" path="/var/lib/kubelet/pods/ac1e854f-a5fd-4cf1-9f74-dcef650d90b4/volumes" Dec 16 08:26:34 crc kubenswrapper[4789]: I1216 08:26:34.117854 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f" path="/var/lib/kubelet/pods/e70fb015-ac29-4a0e-a7ea-eb5cd4d9690f/volumes" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.498074 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-f8mvk"] Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.500147 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.514423 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f8mvk"] Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.603358 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-61a3-account-create-update-vc728"] Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.604895 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.607620 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.609064 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnr4q\" (UniqueName: \"kubernetes.io/projected/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-kube-api-access-dnr4q\") pod \"aodh-db-create-f8mvk\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.609110 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-operator-scripts\") pod \"aodh-db-create-f8mvk\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.615413 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-61a3-account-create-update-vc728"] Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.717118 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c071c8a0-c255-478a-aceb-305f7c8139a5-operator-scripts\") pod \"aodh-61a3-account-create-update-vc728\" (UID: \"c071c8a0-c255-478a-aceb-305f7c8139a5\") " pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.717177 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8pvp\" (UniqueName: \"kubernetes.io/projected/c071c8a0-c255-478a-aceb-305f7c8139a5-kube-api-access-p8pvp\") pod \"aodh-61a3-account-create-update-vc728\" (UID: 
\"c071c8a0-c255-478a-aceb-305f7c8139a5\") " pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.717250 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnr4q\" (UniqueName: \"kubernetes.io/projected/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-kube-api-access-dnr4q\") pod \"aodh-db-create-f8mvk\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.717288 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-operator-scripts\") pod \"aodh-db-create-f8mvk\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.718052 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-operator-scripts\") pod \"aodh-db-create-f8mvk\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.736739 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnr4q\" (UniqueName: \"kubernetes.io/projected/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-kube-api-access-dnr4q\") pod \"aodh-db-create-f8mvk\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.819171 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c071c8a0-c255-478a-aceb-305f7c8139a5-operator-scripts\") pod \"aodh-61a3-account-create-update-vc728\" (UID: \"c071c8a0-c255-478a-aceb-305f7c8139a5\") " 
pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.819214 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8pvp\" (UniqueName: \"kubernetes.io/projected/c071c8a0-c255-478a-aceb-305f7c8139a5-kube-api-access-p8pvp\") pod \"aodh-61a3-account-create-update-vc728\" (UID: \"c071c8a0-c255-478a-aceb-305f7c8139a5\") " pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.820365 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c071c8a0-c255-478a-aceb-305f7c8139a5-operator-scripts\") pod \"aodh-61a3-account-create-update-vc728\" (UID: \"c071c8a0-c255-478a-aceb-305f7c8139a5\") " pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.822843 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.838568 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8pvp\" (UniqueName: \"kubernetes.io/projected/c071c8a0-c255-478a-aceb-305f7c8139a5-kube-api-access-p8pvp\") pod \"aodh-61a3-account-create-update-vc728\" (UID: \"c071c8a0-c255-478a-aceb-305f7c8139a5\") " pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:39 crc kubenswrapper[4789]: I1216 08:26:39.922643 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:40 crc kubenswrapper[4789]: I1216 08:26:40.367893 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f8mvk"] Dec 16 08:26:40 crc kubenswrapper[4789]: I1216 08:26:40.526431 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-61a3-account-create-update-vc728"] Dec 16 08:26:40 crc kubenswrapper[4789]: I1216 08:26:40.814807 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f8mvk" event={"ID":"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3","Type":"ContainerStarted","Data":"3c758e31a5c12b6fef1afa097e767c69ba3c262a2787170676b55ee334675234"} Dec 16 08:26:40 crc kubenswrapper[4789]: I1216 08:26:40.816216 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-61a3-account-create-update-vc728" event={"ID":"c071c8a0-c255-478a-aceb-305f7c8139a5","Type":"ContainerStarted","Data":"2a4912387d2568267ab96d7e34f7afc96930e3552e58342f7553ead8576366ad"} Dec 16 08:26:41 crc kubenswrapper[4789]: I1216 08:26:41.826324 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f8mvk" event={"ID":"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3","Type":"ContainerStarted","Data":"52f54f0bab32d34879fc2cb96a8e08be29093809c7cecc771e40d6362b3548fd"} Dec 16 08:26:41 crc kubenswrapper[4789]: I1216 08:26:41.828663 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-61a3-account-create-update-vc728" event={"ID":"c071c8a0-c255-478a-aceb-305f7c8139a5","Type":"ContainerStarted","Data":"6e3986ef0345e0cc99bacdcdd2db04d15c36b734615cfa4ffbf4ab212107d50e"} Dec 16 08:26:41 crc kubenswrapper[4789]: I1216 08:26:41.845044 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-f8mvk" podStartSLOduration=2.845027896 podStartE2EDuration="2.845027896s" podCreationTimestamp="2025-12-16 08:26:39 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:26:41.839951552 +0000 UTC m=+5740.101839191" watchObservedRunningTime="2025-12-16 08:26:41.845027896 +0000 UTC m=+5740.106915525" Dec 16 08:26:41 crc kubenswrapper[4789]: I1216 08:26:41.867896 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-61a3-account-create-update-vc728" podStartSLOduration=2.8678734649999997 podStartE2EDuration="2.867873465s" podCreationTimestamp="2025-12-16 08:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:26:41.859611442 +0000 UTC m=+5740.121499071" watchObservedRunningTime="2025-12-16 08:26:41.867873465 +0000 UTC m=+5740.129761094" Dec 16 08:26:42 crc kubenswrapper[4789]: I1216 08:26:42.837275 4789 generic.go:334] "Generic (PLEG): container finished" podID="c071c8a0-c255-478a-aceb-305f7c8139a5" containerID="6e3986ef0345e0cc99bacdcdd2db04d15c36b734615cfa4ffbf4ab212107d50e" exitCode=0 Dec 16 08:26:42 crc kubenswrapper[4789]: I1216 08:26:42.837382 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-61a3-account-create-update-vc728" event={"ID":"c071c8a0-c255-478a-aceb-305f7c8139a5","Type":"ContainerDied","Data":"6e3986ef0345e0cc99bacdcdd2db04d15c36b734615cfa4ffbf4ab212107d50e"} Dec 16 08:26:42 crc kubenswrapper[4789]: I1216 08:26:42.839082 4789 generic.go:334] "Generic (PLEG): container finished" podID="bcd73d42-7e0f-4b63-9f9e-cd042b827fe3" containerID="52f54f0bab32d34879fc2cb96a8e08be29093809c7cecc771e40d6362b3548fd" exitCode=0 Dec 16 08:26:42 crc kubenswrapper[4789]: I1216 08:26:42.839120 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f8mvk" event={"ID":"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3","Type":"ContainerDied","Data":"52f54f0bab32d34879fc2cb96a8e08be29093809c7cecc771e40d6362b3548fd"} Dec 16 08:26:44 crc 
kubenswrapper[4789]: I1216 08:26:44.038051 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gdnmc"] Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.047755 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-gdnmc"] Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.118949 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06be30e-36e2-466a-9caa-2cd41c0f4bb6" path="/var/lib/kubelet/pods/e06be30e-36e2-466a-9caa-2cd41c0f4bb6/volumes" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.349805 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.357063 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.421229 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c071c8a0-c255-478a-aceb-305f7c8139a5-operator-scripts\") pod \"c071c8a0-c255-478a-aceb-305f7c8139a5\" (UID: \"c071c8a0-c255-478a-aceb-305f7c8139a5\") " Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.421415 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8pvp\" (UniqueName: \"kubernetes.io/projected/c071c8a0-c255-478a-aceb-305f7c8139a5-kube-api-access-p8pvp\") pod \"c071c8a0-c255-478a-aceb-305f7c8139a5\" (UID: \"c071c8a0-c255-478a-aceb-305f7c8139a5\") " Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.423849 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c071c8a0-c255-478a-aceb-305f7c8139a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c071c8a0-c255-478a-aceb-305f7c8139a5" (UID: 
"c071c8a0-c255-478a-aceb-305f7c8139a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.438774 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c071c8a0-c255-478a-aceb-305f7c8139a5-kube-api-access-p8pvp" (OuterVolumeSpecName: "kube-api-access-p8pvp") pod "c071c8a0-c255-478a-aceb-305f7c8139a5" (UID: "c071c8a0-c255-478a-aceb-305f7c8139a5"). InnerVolumeSpecName "kube-api-access-p8pvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.523142 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-operator-scripts\") pod \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.523267 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnr4q\" (UniqueName: \"kubernetes.io/projected/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-kube-api-access-dnr4q\") pod \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\" (UID: \"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3\") " Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.523839 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bcd73d42-7e0f-4b63-9f9e-cd042b827fe3" (UID: "bcd73d42-7e0f-4b63-9f9e-cd042b827fe3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.524508 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.524527 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8pvp\" (UniqueName: \"kubernetes.io/projected/c071c8a0-c255-478a-aceb-305f7c8139a5-kube-api-access-p8pvp\") on node \"crc\" DevicePath \"\"" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.524538 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c071c8a0-c255-478a-aceb-305f7c8139a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.530333 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-kube-api-access-dnr4q" (OuterVolumeSpecName: "kube-api-access-dnr4q") pod "bcd73d42-7e0f-4b63-9f9e-cd042b827fe3" (UID: "bcd73d42-7e0f-4b63-9f9e-cd042b827fe3"). InnerVolumeSpecName "kube-api-access-dnr4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.626112 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnr4q\" (UniqueName: \"kubernetes.io/projected/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3-kube-api-access-dnr4q\") on node \"crc\" DevicePath \"\"" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.855522 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f8mvk" event={"ID":"bcd73d42-7e0f-4b63-9f9e-cd042b827fe3","Type":"ContainerDied","Data":"3c758e31a5c12b6fef1afa097e767c69ba3c262a2787170676b55ee334675234"} Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.855563 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c758e31a5c12b6fef1afa097e767c69ba3c262a2787170676b55ee334675234" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.855562 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f8mvk" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.857087 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-61a3-account-create-update-vc728" event={"ID":"c071c8a0-c255-478a-aceb-305f7c8139a5","Type":"ContainerDied","Data":"2a4912387d2568267ab96d7e34f7afc96930e3552e58342f7553ead8576366ad"} Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.857119 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-61a3-account-create-update-vc728" Dec 16 08:26:44 crc kubenswrapper[4789]: I1216 08:26:44.857117 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a4912387d2568267ab96d7e34f7afc96930e3552e58342f7553ead8576366ad" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.940372 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-bv2lw"] Dec 16 08:26:49 crc kubenswrapper[4789]: E1216 08:26:49.941477 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c071c8a0-c255-478a-aceb-305f7c8139a5" containerName="mariadb-account-create-update" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.941490 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c071c8a0-c255-478a-aceb-305f7c8139a5" containerName="mariadb-account-create-update" Dec 16 08:26:49 crc kubenswrapper[4789]: E1216 08:26:49.941500 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd73d42-7e0f-4b63-9f9e-cd042b827fe3" containerName="mariadb-database-create" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.941507 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd73d42-7e0f-4b63-9f9e-cd042b827fe3" containerName="mariadb-database-create" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.941686 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd73d42-7e0f-4b63-9f9e-cd042b827fe3" containerName="mariadb-database-create" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.941712 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c071c8a0-c255-478a-aceb-305f7c8139a5" containerName="mariadb-account-create-update" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.942508 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.946430 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5tmwz" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.946492 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.946507 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.946609 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 08:26:49 crc kubenswrapper[4789]: I1216 08:26:49.951788 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bv2lw"] Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.034053 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-config-data\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.034232 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-scripts\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.034488 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-combined-ca-bundle\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " 
pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.034571 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjn56\" (UniqueName: \"kubernetes.io/projected/8dc1a722-c545-4ed5-b458-e8c9864a1e19-kube-api-access-cjn56\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.135955 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-scripts\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.136050 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-combined-ca-bundle\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.136093 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjn56\" (UniqueName: \"kubernetes.io/projected/8dc1a722-c545-4ed5-b458-e8c9864a1e19-kube-api-access-cjn56\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.136211 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-config-data\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.143808 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-combined-ca-bundle\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.145043 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-scripts\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.146399 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-config-data\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.172563 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjn56\" (UniqueName: \"kubernetes.io/projected/8dc1a722-c545-4ed5-b458-e8c9864a1e19-kube-api-access-cjn56\") pod \"aodh-db-sync-bv2lw\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.283710 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.792548 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bv2lw"] Dec 16 08:26:50 crc kubenswrapper[4789]: W1216 08:26:50.793381 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dc1a722_c545_4ed5_b458_e8c9864a1e19.slice/crio-9ea64d403f3d9c0172880f65e43e043f07b75ac7b79052e7b18c5a0755559117 WatchSource:0}: Error finding container 9ea64d403f3d9c0172880f65e43e043f07b75ac7b79052e7b18c5a0755559117: Status 404 returned error can't find the container with id 9ea64d403f3d9c0172880f65e43e043f07b75ac7b79052e7b18c5a0755559117 Dec 16 08:26:50 crc kubenswrapper[4789]: I1216 08:26:50.918533 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bv2lw" event={"ID":"8dc1a722-c545-4ed5-b458-e8c9864a1e19","Type":"ContainerStarted","Data":"9ea64d403f3d9c0172880f65e43e043f07b75ac7b79052e7b18c5a0755559117"} Dec 16 08:26:51 crc kubenswrapper[4789]: I1216 08:26:51.927553 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:26:51 crc kubenswrapper[4789]: I1216 08:26:51.927846 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:26:53 crc kubenswrapper[4789]: I1216 08:26:53.951554 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 08:26:56 crc 
kubenswrapper[4789]: I1216 08:26:56.984010 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bv2lw" event={"ID":"8dc1a722-c545-4ed5-b458-e8c9864a1e19","Type":"ContainerStarted","Data":"83b7e30363c240633f6e9c730caca7e2cf5e9acbc70da3518a11409db100ca54"} Dec 16 08:26:56 crc kubenswrapper[4789]: I1216 08:26:56.999236 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-bv2lw" podStartSLOduration=2.531640009 podStartE2EDuration="7.999219386s" podCreationTimestamp="2025-12-16 08:26:49 +0000 UTC" firstStartedPulling="2025-12-16 08:26:50.800011405 +0000 UTC m=+5749.061899034" lastFinishedPulling="2025-12-16 08:26:56.267590782 +0000 UTC m=+5754.529478411" observedRunningTime="2025-12-16 08:26:56.997136606 +0000 UTC m=+5755.259024245" watchObservedRunningTime="2025-12-16 08:26:56.999219386 +0000 UTC m=+5755.261107015" Dec 16 08:26:59 crc kubenswrapper[4789]: I1216 08:26:59.003943 4789 generic.go:334] "Generic (PLEG): container finished" podID="8dc1a722-c545-4ed5-b458-e8c9864a1e19" containerID="83b7e30363c240633f6e9c730caca7e2cf5e9acbc70da3518a11409db100ca54" exitCode=0 Dec 16 08:26:59 crc kubenswrapper[4789]: I1216 08:26:59.004146 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bv2lw" event={"ID":"8dc1a722-c545-4ed5-b458-e8c9864a1e19","Type":"ContainerDied","Data":"83b7e30363c240633f6e9c730caca7e2cf5e9acbc70da3518a11409db100ca54"} Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.416065 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.477301 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjn56\" (UniqueName: \"kubernetes.io/projected/8dc1a722-c545-4ed5-b458-e8c9864a1e19-kube-api-access-cjn56\") pod \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.477382 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-config-data\") pod \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.477565 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-combined-ca-bundle\") pod \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.477618 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-scripts\") pod \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\" (UID: \"8dc1a722-c545-4ed5-b458-e8c9864a1e19\") " Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.482707 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc1a722-c545-4ed5-b458-e8c9864a1e19-kube-api-access-cjn56" (OuterVolumeSpecName: "kube-api-access-cjn56") pod "8dc1a722-c545-4ed5-b458-e8c9864a1e19" (UID: "8dc1a722-c545-4ed5-b458-e8c9864a1e19"). InnerVolumeSpecName "kube-api-access-cjn56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.487805 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-scripts" (OuterVolumeSpecName: "scripts") pod "8dc1a722-c545-4ed5-b458-e8c9864a1e19" (UID: "8dc1a722-c545-4ed5-b458-e8c9864a1e19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.505727 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dc1a722-c545-4ed5-b458-e8c9864a1e19" (UID: "8dc1a722-c545-4ed5-b458-e8c9864a1e19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.509033 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-config-data" (OuterVolumeSpecName: "config-data") pod "8dc1a722-c545-4ed5-b458-e8c9864a1e19" (UID: "8dc1a722-c545-4ed5-b458-e8c9864a1e19"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.580529 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjn56\" (UniqueName: \"kubernetes.io/projected/8dc1a722-c545-4ed5-b458-e8c9864a1e19-kube-api-access-cjn56\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.580568 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.580582 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:00 crc kubenswrapper[4789]: I1216 08:27:00.580592 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc1a722-c545-4ed5-b458-e8c9864a1e19-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:01 crc kubenswrapper[4789]: I1216 08:27:01.028105 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bv2lw" event={"ID":"8dc1a722-c545-4ed5-b458-e8c9864a1e19","Type":"ContainerDied","Data":"9ea64d403f3d9c0172880f65e43e043f07b75ac7b79052e7b18c5a0755559117"} Dec 16 08:27:01 crc kubenswrapper[4789]: I1216 08:27:01.029069 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea64d403f3d9c0172880f65e43e043f07b75ac7b79052e7b18c5a0755559117" Dec 16 08:27:01 crc kubenswrapper[4789]: I1216 08:27:01.028158 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bv2lw" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.571794 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 16 08:27:04 crc kubenswrapper[4789]: E1216 08:27:04.574008 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc1a722-c545-4ed5-b458-e8c9864a1e19" containerName="aodh-db-sync" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.574140 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc1a722-c545-4ed5-b458-e8c9864a1e19" containerName="aodh-db-sync" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.574605 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc1a722-c545-4ed5-b458-e8c9864a1e19" containerName="aodh-db-sync" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.579210 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.586046 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5tmwz" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.586202 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.586608 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.596761 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.662847 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-config-data\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 
08:27:04.662894 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-scripts\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.662965 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.663010 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnbnd\" (UniqueName: \"kubernetes.io/projected/6a6f0826-3133-40ba-9139-b5cec2b92a29-kube-api-access-vnbnd\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.764768 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-config-data\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.764815 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-scripts\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.764868 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-combined-ca-bundle\") pod \"aodh-0\" (UID: 
\"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.764925 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnbnd\" (UniqueName: \"kubernetes.io/projected/6a6f0826-3133-40ba-9139-b5cec2b92a29-kube-api-access-vnbnd\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.784572 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.785251 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-scripts\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.796804 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a6f0826-3133-40ba-9139-b5cec2b92a29-config-data\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.809458 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnbnd\" (UniqueName: \"kubernetes.io/projected/6a6f0826-3133-40ba-9139-b5cec2b92a29-kube-api-access-vnbnd\") pod \"aodh-0\" (UID: \"6a6f0826-3133-40ba-9139-b5cec2b92a29\") " pod="openstack/aodh-0" Dec 16 08:27:04 crc kubenswrapper[4789]: I1216 08:27:04.910024 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 16 08:27:05 crc kubenswrapper[4789]: W1216 08:27:05.356401 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a6f0826_3133_40ba_9139_b5cec2b92a29.slice/crio-4fef2c2e7379b62fc0e92f30417d7f5149339469e87bef3c99e4add8c403dfeb WatchSource:0}: Error finding container 4fef2c2e7379b62fc0e92f30417d7f5149339469e87bef3c99e4add8c403dfeb: Status 404 returned error can't find the container with id 4fef2c2e7379b62fc0e92f30417d7f5149339469e87bef3c99e4add8c403dfeb Dec 16 08:27:05 crc kubenswrapper[4789]: I1216 08:27:05.360206 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 16 08:27:06 crc kubenswrapper[4789]: I1216 08:27:06.072374 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6a6f0826-3133-40ba-9139-b5cec2b92a29","Type":"ContainerStarted","Data":"3bc82c955daf352bf4d7346bd910a4ce513a66d916e21b4cf1cfa9c0486981f5"} Dec 16 08:27:06 crc kubenswrapper[4789]: I1216 08:27:06.072696 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6a6f0826-3133-40ba-9139-b5cec2b92a29","Type":"ContainerStarted","Data":"4fef2c2e7379b62fc0e92f30417d7f5149339469e87bef3c99e4add8c403dfeb"} Dec 16 08:27:06 crc kubenswrapper[4789]: I1216 08:27:06.687516 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:06 crc kubenswrapper[4789]: I1216 08:27:06.688448 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-central-agent" containerID="cri-o://1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2" gracePeriod=30 Dec 16 08:27:06 crc kubenswrapper[4789]: I1216 08:27:06.688640 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-notification-agent" containerID="cri-o://ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005" gracePeriod=30 Dec 16 08:27:06 crc kubenswrapper[4789]: I1216 08:27:06.688626 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="sg-core" containerID="cri-o://1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb" gracePeriod=30 Dec 16 08:27:06 crc kubenswrapper[4789]: I1216 08:27:06.688669 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="proxy-httpd" containerID="cri-o://5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805" gracePeriod=30 Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.090218 4789 generic.go:334] "Generic (PLEG): container finished" podID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerID="5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805" exitCode=0 Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.090256 4789 generic.go:334] "Generic (PLEG): container finished" podID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerID="1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb" exitCode=2 Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.090274 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerDied","Data":"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805"} Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.090303 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerDied","Data":"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb"} Dec 16 08:27:07 crc 
kubenswrapper[4789]: I1216 08:27:07.824769 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.936360 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-combined-ca-bundle\") pod \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.936481 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-config-data\") pod \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.937325 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-sg-core-conf-yaml\") pod \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.937373 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-run-httpd\") pod \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.937493 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-762z5\" (UniqueName: \"kubernetes.io/projected/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-kube-api-access-762z5\") pod \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.937576 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-log-httpd\") pod \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.937602 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-scripts\") pod \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\" (UID: \"b19174f6-911d-4d14-a9b1-c7e6c555f4ac\") " Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.941347 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-scripts" (OuterVolumeSpecName: "scripts") pod "b19174f6-911d-4d14-a9b1-c7e6c555f4ac" (UID: "b19174f6-911d-4d14-a9b1-c7e6c555f4ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.941751 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b19174f6-911d-4d14-a9b1-c7e6c555f4ac" (UID: "b19174f6-911d-4d14-a9b1-c7e6c555f4ac"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.941883 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-kube-api-access-762z5" (OuterVolumeSpecName: "kube-api-access-762z5") pod "b19174f6-911d-4d14-a9b1-c7e6c555f4ac" (UID: "b19174f6-911d-4d14-a9b1-c7e6c555f4ac"). InnerVolumeSpecName "kube-api-access-762z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.942549 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b19174f6-911d-4d14-a9b1-c7e6c555f4ac" (UID: "b19174f6-911d-4d14-a9b1-c7e6c555f4ac"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:27:07 crc kubenswrapper[4789]: I1216 08:27:07.986200 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b19174f6-911d-4d14-a9b1-c7e6c555f4ac" (UID: "b19174f6-911d-4d14-a9b1-c7e6c555f4ac"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.040019 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-762z5\" (UniqueName: \"kubernetes.io/projected/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-kube-api-access-762z5\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.040049 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.040061 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.040072 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 
08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.040085 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.058585 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-config-data" (OuterVolumeSpecName: "config-data") pod "b19174f6-911d-4d14-a9b1-c7e6c555f4ac" (UID: "b19174f6-911d-4d14-a9b1-c7e6c555f4ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.070520 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b19174f6-911d-4d14-a9b1-c7e6c555f4ac" (UID: "b19174f6-911d-4d14-a9b1-c7e6c555f4ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.101200 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6a6f0826-3133-40ba-9139-b5cec2b92a29","Type":"ContainerStarted","Data":"84341e3b7344fbd2e3f448f34a95394462ed975926c1a2791ea7db1d3437a0cd"} Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.103636 4789 generic.go:334] "Generic (PLEG): container finished" podID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerID="ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005" exitCode=0 Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.103659 4789 generic.go:334] "Generic (PLEG): container finished" podID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerID="1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2" exitCode=0 Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.103676 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerDied","Data":"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005"} Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.103695 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerDied","Data":"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2"} Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.103708 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b19174f6-911d-4d14-a9b1-c7e6c555f4ac","Type":"ContainerDied","Data":"7086d2765987ff1a73a273d38a0987adcfb085174d0a86cd40b3571bf6d445e4"} Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.103724 4789 scope.go:117] "RemoveContainer" containerID="5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.103705 4789 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.127330 4789 scope.go:117] "RemoveContainer" containerID="1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.141950 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.141988 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b19174f6-911d-4d14-a9b1-c7e6c555f4ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.512264 4789 scope.go:117] "RemoveContainer" containerID="ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.559292 4789 scope.go:117] "RemoveContainer" containerID="1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.636060 4789 scope.go:117] "RemoveContainer" containerID="5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805" Dec 16 08:27:08 crc kubenswrapper[4789]: E1216 08:27:08.636607 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805\": container with ID starting with 5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805 not found: ID does not exist" containerID="5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.636646 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805"} err="failed to get container status \"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805\": rpc error: code = NotFound desc = could not find container \"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805\": container with ID starting with 5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805 not found: ID does not exist" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.636668 4789 scope.go:117] "RemoveContainer" containerID="1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb" Dec 16 08:27:08 crc kubenswrapper[4789]: E1216 08:27:08.638615 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb\": container with ID starting with 1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb not found: ID does not exist" containerID="1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.638637 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb"} err="failed to get container status \"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb\": rpc error: code = NotFound desc = could not find container \"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb\": container with ID starting with 1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb not found: ID does not exist" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.638650 4789 scope.go:117] "RemoveContainer" containerID="ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005" Dec 16 08:27:08 crc kubenswrapper[4789]: E1216 08:27:08.639097 4789 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005\": container with ID starting with ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005 not found: ID does not exist" containerID="ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.639148 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005"} err="failed to get container status \"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005\": rpc error: code = NotFound desc = could not find container \"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005\": container with ID starting with ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005 not found: ID does not exist" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.639180 4789 scope.go:117] "RemoveContainer" containerID="1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2" Dec 16 08:27:08 crc kubenswrapper[4789]: E1216 08:27:08.639620 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2\": container with ID starting with 1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2 not found: ID does not exist" containerID="1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.639644 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2"} err="failed to get container status \"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2\": rpc error: code = NotFound desc = could not find container 
\"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2\": container with ID starting with 1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2 not found: ID does not exist" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.639680 4789 scope.go:117] "RemoveContainer" containerID="5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.640014 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805"} err="failed to get container status \"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805\": rpc error: code = NotFound desc = could not find container \"5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805\": container with ID starting with 5fa15137eba635de4e4ecdb97519b4bcc216151756f7765688d26c4c5a19b805 not found: ID does not exist" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.640039 4789 scope.go:117] "RemoveContainer" containerID="1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.640389 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb"} err="failed to get container status \"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb\": rpc error: code = NotFound desc = could not find container \"1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb\": container with ID starting with 1dcef11ccb42012e82e5564adc7ca77668e9650788afc5c4cc2e33d109b799fb not found: ID does not exist" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.640412 4789 scope.go:117] "RemoveContainer" containerID="ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.641090 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005"} err="failed to get container status \"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005\": rpc error: code = NotFound desc = could not find container \"ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005\": container with ID starting with ab73b10ba64e1709b444bb7dff8073262fe20da0506cc75d297f1336deb56005 not found: ID does not exist" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.643131 4789 scope.go:117] "RemoveContainer" containerID="1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2" Dec 16 08:27:08 crc kubenswrapper[4789]: I1216 08:27:08.644535 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2"} err="failed to get container status \"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2\": rpc error: code = NotFound desc = could not find container \"1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2\": container with ID starting with 1b6882ec87e22cd1828e6a3857d9f1228873b1441b4d132ade61e4b197a14eb2 not found: ID does not exist" Dec 16 08:27:09 crc kubenswrapper[4789]: I1216 08:27:09.112751 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6a6f0826-3133-40ba-9139-b5cec2b92a29","Type":"ContainerStarted","Data":"3c0c651aabd9f33645fdf011fb92df599b3692ea5227055733a869c7d953cdf8"} Dec 16 08:27:10 crc kubenswrapper[4789]: I1216 08:27:10.123044 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6a6f0826-3133-40ba-9139-b5cec2b92a29","Type":"ContainerStarted","Data":"acb8ef98ae4c2277530a244b3e6b94c293627b7ec42e51411dcc5f151282adb7"} Dec 16 08:27:10 crc kubenswrapper[4789]: I1216 08:27:10.147606 4789 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.777859114 podStartE2EDuration="6.147588854s" podCreationTimestamp="2025-12-16 08:27:04 +0000 UTC" firstStartedPulling="2025-12-16 08:27:05.359013945 +0000 UTC m=+5763.620901574" lastFinishedPulling="2025-12-16 08:27:09.728743685 +0000 UTC m=+5767.990631314" observedRunningTime="2025-12-16 08:27:10.144299473 +0000 UTC m=+5768.406187092" watchObservedRunningTime="2025-12-16 08:27:10.147588854 +0000 UTC m=+5768.409476483" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.012974 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-gnbxf"] Dec 16 08:27:15 crc kubenswrapper[4789]: E1216 08:27:15.013817 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-central-agent" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.013831 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-central-agent" Dec 16 08:27:15 crc kubenswrapper[4789]: E1216 08:27:15.013847 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="proxy-httpd" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.013853 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="proxy-httpd" Dec 16 08:27:15 crc kubenswrapper[4789]: E1216 08:27:15.013880 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="sg-core" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.013886 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="sg-core" Dec 16 08:27:15 crc kubenswrapper[4789]: E1216 08:27:15.013906 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-notification-agent" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.013929 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-notification-agent" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.014093 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="proxy-httpd" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.014115 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-central-agent" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.014123 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="sg-core" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.014141 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" containerName="ceilometer-notification-agent" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.014850 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-gnbxf" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.024082 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-gnbxf"] Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.082886 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-operator-scripts\") pod \"manila-db-create-gnbxf\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") " pod="openstack/manila-db-create-gnbxf" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.083093 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrg7\" (UniqueName: \"kubernetes.io/projected/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-kube-api-access-cjrg7\") pod \"manila-db-create-gnbxf\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") " pod="openstack/manila-db-create-gnbxf" Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.126963 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-d8d3-account-create-update-646vd"] Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.128215 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.130437 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.137839 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-d8d3-account-create-update-646vd"]
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.184969 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrg7\" (UniqueName: \"kubernetes.io/projected/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-kube-api-access-cjrg7\") pod \"manila-db-create-gnbxf\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") " pod="openstack/manila-db-create-gnbxf"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.185122 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8wm5\" (UniqueName: \"kubernetes.io/projected/117cb35b-e746-480c-bec0-65c7307f3dc2-kube-api-access-n8wm5\") pod \"manila-d8d3-account-create-update-646vd\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") " pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.185161 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117cb35b-e746-480c-bec0-65c7307f3dc2-operator-scripts\") pod \"manila-d8d3-account-create-update-646vd\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") " pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.185215 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-operator-scripts\") pod \"manila-db-create-gnbxf\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") " pod="openstack/manila-db-create-gnbxf"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.186483 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-operator-scripts\") pod \"manila-db-create-gnbxf\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") " pod="openstack/manila-db-create-gnbxf"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.206591 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrg7\" (UniqueName: \"kubernetes.io/projected/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-kube-api-access-cjrg7\") pod \"manila-db-create-gnbxf\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") " pod="openstack/manila-db-create-gnbxf"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.286469 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8wm5\" (UniqueName: \"kubernetes.io/projected/117cb35b-e746-480c-bec0-65c7307f3dc2-kube-api-access-n8wm5\") pod \"manila-d8d3-account-create-update-646vd\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") " pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.286511 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117cb35b-e746-480c-bec0-65c7307f3dc2-operator-scripts\") pod \"manila-d8d3-account-create-update-646vd\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") " pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.287386 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117cb35b-e746-480c-bec0-65c7307f3dc2-operator-scripts\") pod \"manila-d8d3-account-create-update-646vd\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") " pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.304859 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8wm5\" (UniqueName: \"kubernetes.io/projected/117cb35b-e746-480c-bec0-65c7307f3dc2-kube-api-access-n8wm5\") pod \"manila-d8d3-account-create-update-646vd\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") " pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.346137 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-gnbxf"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.446578 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.833514 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-gnbxf"]
Dec 16 08:27:15 crc kubenswrapper[4789]: I1216 08:27:15.981525 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-d8d3-account-create-update-646vd"]
Dec 16 08:27:15 crc kubenswrapper[4789]: W1216 08:27:15.990102 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117cb35b_e746_480c_bec0_65c7307f3dc2.slice/crio-5c3f8223739ca6cbc73e4a7d4d38d80e12140ba8d7f38fe5f2704d028eabe95f WatchSource:0}: Error finding container 5c3f8223739ca6cbc73e4a7d4d38d80e12140ba8d7f38fe5f2704d028eabe95f: Status 404 returned error can't find the container with id 5c3f8223739ca6cbc73e4a7d4d38d80e12140ba8d7f38fe5f2704d028eabe95f
Dec 16 08:27:16 crc kubenswrapper[4789]: I1216 08:27:16.175951 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d8d3-account-create-update-646vd" event={"ID":"117cb35b-e746-480c-bec0-65c7307f3dc2","Type":"ContainerStarted","Data":"970e38dd654fd25c83865011e1140c028e03554f3fe0f84c3365a6dfd3c1498b"}
Dec 16 08:27:16 crc kubenswrapper[4789]: I1216 08:27:16.176509 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d8d3-account-create-update-646vd" event={"ID":"117cb35b-e746-480c-bec0-65c7307f3dc2","Type":"ContainerStarted","Data":"5c3f8223739ca6cbc73e4a7d4d38d80e12140ba8d7f38fe5f2704d028eabe95f"}
Dec 16 08:27:16 crc kubenswrapper[4789]: I1216 08:27:16.178707 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gnbxf" event={"ID":"bde5c8bd-5cc0-4f13-acb8-efe1c8560202","Type":"ContainerStarted","Data":"6627f3388c5adde55de17df45e4839aa21d7d5b9c2f9f471a05c7e4f258a1282"}
Dec 16 08:27:16 crc kubenswrapper[4789]: I1216 08:27:16.178736 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gnbxf" event={"ID":"bde5c8bd-5cc0-4f13-acb8-efe1c8560202","Type":"ContainerStarted","Data":"bf91f41461c7e2e97fcae935576dd8ef7d292e448fab66683bceaa8fc5d0d6f1"}
Dec 16 08:27:16 crc kubenswrapper[4789]: I1216 08:27:16.195001 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-d8d3-account-create-update-646vd" podStartSLOduration=1.1949365730000001 podStartE2EDuration="1.194936573s" podCreationTimestamp="2025-12-16 08:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:27:16.188341232 +0000 UTC m=+5774.450228861" watchObservedRunningTime="2025-12-16 08:27:16.194936573 +0000 UTC m=+5774.456824212"
Dec 16 08:27:16 crc kubenswrapper[4789]: I1216 08:27:16.205034 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-gnbxf" podStartSLOduration=2.205014389 podStartE2EDuration="2.205014389s" podCreationTimestamp="2025-12-16 08:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:27:16.202863057 +0000 UTC m=+5774.464750696" watchObservedRunningTime="2025-12-16 08:27:16.205014389 +0000 UTC m=+5774.466902018"
Dec 16 08:27:16 crc kubenswrapper[4789]: E1216 08:27:16.495302 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117cb35b_e746_480c_bec0_65c7307f3dc2.slice/crio-conmon-970e38dd654fd25c83865011e1140c028e03554f3fe0f84c3365a6dfd3c1498b.scope\": RecentStats: unable to find data in memory cache]"
Dec 16 08:27:17 crc kubenswrapper[4789]: I1216 08:27:17.187856 4789 generic.go:334] "Generic (PLEG): container finished" podID="117cb35b-e746-480c-bec0-65c7307f3dc2" containerID="970e38dd654fd25c83865011e1140c028e03554f3fe0f84c3365a6dfd3c1498b" exitCode=0
Dec 16 08:27:17 crc kubenswrapper[4789]: I1216 08:27:17.187943 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d8d3-account-create-update-646vd" event={"ID":"117cb35b-e746-480c-bec0-65c7307f3dc2","Type":"ContainerDied","Data":"970e38dd654fd25c83865011e1140c028e03554f3fe0f84c3365a6dfd3c1498b"}
Dec 16 08:27:17 crc kubenswrapper[4789]: I1216 08:27:17.189593 4789 generic.go:334] "Generic (PLEG): container finished" podID="bde5c8bd-5cc0-4f13-acb8-efe1c8560202" containerID="6627f3388c5adde55de17df45e4839aa21d7d5b9c2f9f471a05c7e4f258a1282" exitCode=0
Dec 16 08:27:17 crc kubenswrapper[4789]: I1216 08:27:17.189625 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gnbxf" event={"ID":"bde5c8bd-5cc0-4f13-acb8-efe1c8560202","Type":"ContainerDied","Data":"6627f3388c5adde55de17df45e4839aa21d7d5b9c2f9f471a05c7e4f258a1282"}
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.732213 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.740807 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-gnbxf"
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.768588 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-operator-scripts\") pod \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") "
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.768663 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8wm5\" (UniqueName: \"kubernetes.io/projected/117cb35b-e746-480c-bec0-65c7307f3dc2-kube-api-access-n8wm5\") pod \"117cb35b-e746-480c-bec0-65c7307f3dc2\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") "
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.768715 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117cb35b-e746-480c-bec0-65c7307f3dc2-operator-scripts\") pod \"117cb35b-e746-480c-bec0-65c7307f3dc2\" (UID: \"117cb35b-e746-480c-bec0-65c7307f3dc2\") "
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.768759 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjrg7\" (UniqueName: \"kubernetes.io/projected/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-kube-api-access-cjrg7\") pod \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\" (UID: \"bde5c8bd-5cc0-4f13-acb8-efe1c8560202\") "
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.771232 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/117cb35b-e746-480c-bec0-65c7307f3dc2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "117cb35b-e746-480c-bec0-65c7307f3dc2" (UID: "117cb35b-e746-480c-bec0-65c7307f3dc2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.771679 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bde5c8bd-5cc0-4f13-acb8-efe1c8560202" (UID: "bde5c8bd-5cc0-4f13-acb8-efe1c8560202"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.775936 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117cb35b-e746-480c-bec0-65c7307f3dc2-kube-api-access-n8wm5" (OuterVolumeSpecName: "kube-api-access-n8wm5") pod "117cb35b-e746-480c-bec0-65c7307f3dc2" (UID: "117cb35b-e746-480c-bec0-65c7307f3dc2"). InnerVolumeSpecName "kube-api-access-n8wm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.785533 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-kube-api-access-cjrg7" (OuterVolumeSpecName: "kube-api-access-cjrg7") pod "bde5c8bd-5cc0-4f13-acb8-efe1c8560202" (UID: "bde5c8bd-5cc0-4f13-acb8-efe1c8560202"). InnerVolumeSpecName "kube-api-access-cjrg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.871205 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/117cb35b-e746-480c-bec0-65c7307f3dc2-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.871234 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjrg7\" (UniqueName: \"kubernetes.io/projected/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-kube-api-access-cjrg7\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.871244 4789 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde5c8bd-5cc0-4f13-acb8-efe1c8560202-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:18 crc kubenswrapper[4789]: I1216 08:27:18.871255 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8wm5\" (UniqueName: \"kubernetes.io/projected/117cb35b-e746-480c-bec0-65c7307f3dc2-kube-api-access-n8wm5\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:19 crc kubenswrapper[4789]: I1216 08:27:19.206631 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-d8d3-account-create-update-646vd" event={"ID":"117cb35b-e746-480c-bec0-65c7307f3dc2","Type":"ContainerDied","Data":"5c3f8223739ca6cbc73e4a7d4d38d80e12140ba8d7f38fe5f2704d028eabe95f"}
Dec 16 08:27:19 crc kubenswrapper[4789]: I1216 08:27:19.206959 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c3f8223739ca6cbc73e4a7d4d38d80e12140ba8d7f38fe5f2704d028eabe95f"
Dec 16 08:27:19 crc kubenswrapper[4789]: I1216 08:27:19.206639 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-d8d3-account-create-update-646vd"
Dec 16 08:27:19 crc kubenswrapper[4789]: I1216 08:27:19.208154 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gnbxf" event={"ID":"bde5c8bd-5cc0-4f13-acb8-efe1c8560202","Type":"ContainerDied","Data":"bf91f41461c7e2e97fcae935576dd8ef7d292e448fab66683bceaa8fc5d0d6f1"}
Dec 16 08:27:19 crc kubenswrapper[4789]: I1216 08:27:19.208192 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf91f41461c7e2e97fcae935576dd8ef7d292e448fab66683bceaa8fc5d0d6f1"
Dec 16 08:27:19 crc kubenswrapper[4789]: I1216 08:27:19.208390 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-gnbxf"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.448564 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-xf5wn"]
Dec 16 08:27:20 crc kubenswrapper[4789]: E1216 08:27:20.448988 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117cb35b-e746-480c-bec0-65c7307f3dc2" containerName="mariadb-account-create-update"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.449004 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="117cb35b-e746-480c-bec0-65c7307f3dc2" containerName="mariadb-account-create-update"
Dec 16 08:27:20 crc kubenswrapper[4789]: E1216 08:27:20.449022 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde5c8bd-5cc0-4f13-acb8-efe1c8560202" containerName="mariadb-database-create"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.449028 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde5c8bd-5cc0-4f13-acb8-efe1c8560202" containerName="mariadb-database-create"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.449233 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde5c8bd-5cc0-4f13-acb8-efe1c8560202" containerName="mariadb-database-create"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.449247 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="117cb35b-e746-480c-bec0-65c7307f3dc2" containerName="mariadb-account-create-update"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.449930 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.452819 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-gl4rx"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.453035 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.461438 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-xf5wn"]
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.500618 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-combined-ca-bundle\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.500851 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-job-config-data\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.500881 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-config-data\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.501027 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psprx\" (UniqueName: \"kubernetes.io/projected/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-kube-api-access-psprx\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.602246 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psprx\" (UniqueName: \"kubernetes.io/projected/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-kube-api-access-psprx\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.602549 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-combined-ca-bundle\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.602643 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-job-config-data\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.602714 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-config-data\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.607546 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-combined-ca-bundle\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.607808 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-job-config-data\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.608220 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-config-data\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.617327 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psprx\" (UniqueName: \"kubernetes.io/projected/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-kube-api-access-psprx\") pod \"manila-db-sync-xf5wn\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") " pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:20 crc kubenswrapper[4789]: I1216 08:27:20.769701 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:21 crc kubenswrapper[4789]: I1216 08:27:21.314410 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-xf5wn"]
Dec 16 08:27:21 crc kubenswrapper[4789]: W1216 08:27:21.324494 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a044ea9_86b0_4a61_b6b9_55f3663f1d3a.slice/crio-b288ce928c333457df756c562bd82f118a76480be446f946eaf0358d42ffd020 WatchSource:0}: Error finding container b288ce928c333457df756c562bd82f118a76480be446f946eaf0358d42ffd020: Status 404 returned error can't find the container with id b288ce928c333457df756c562bd82f118a76480be446f946eaf0358d42ffd020
Dec 16 08:27:21 crc kubenswrapper[4789]: I1216 08:27:21.927879 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:27:21 crc kubenswrapper[4789]: I1216 08:27:21.927990 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:27:22 crc kubenswrapper[4789]: I1216 08:27:22.244602 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xf5wn" event={"ID":"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a","Type":"ContainerStarted","Data":"b288ce928c333457df756c562bd82f118a76480be446f946eaf0358d42ffd020"}
Dec 16 08:27:28 crc kubenswrapper[4789]: I1216 08:27:28.312421 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xf5wn" event={"ID":"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a","Type":"ContainerStarted","Data":"f9cba574f842f45fefa861d72dbe686297c23958516ac88a28f8366d0364219a"}
Dec 16 08:27:28 crc kubenswrapper[4789]: I1216 08:27:28.337233 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-xf5wn" podStartSLOduration=2.560377049 podStartE2EDuration="8.337212376s" podCreationTimestamp="2025-12-16 08:27:20 +0000 UTC" firstStartedPulling="2025-12-16 08:27:21.327097711 +0000 UTC m=+5779.588985350" lastFinishedPulling="2025-12-16 08:27:27.103933048 +0000 UTC m=+5785.365820677" observedRunningTime="2025-12-16 08:27:28.334199362 +0000 UTC m=+5786.596086991" watchObservedRunningTime="2025-12-16 08:27:28.337212376 +0000 UTC m=+5786.599100005"
Dec 16 08:27:30 crc kubenswrapper[4789]: I1216 08:27:30.328870 4789 generic.go:334] "Generic (PLEG): container finished" podID="1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" containerID="f9cba574f842f45fefa861d72dbe686297c23958516ac88a28f8366d0364219a" exitCode=0
Dec 16 08:27:30 crc kubenswrapper[4789]: I1216 08:27:30.328953 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xf5wn" event={"ID":"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a","Type":"ContainerDied","Data":"f9cba574f842f45fefa861d72dbe686297c23958516ac88a28f8366d0364219a"}
Dec 16 08:27:31 crc kubenswrapper[4789]: I1216 08:27:31.834856 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:31 crc kubenswrapper[4789]: I1216 08:27:31.934616 4789 scope.go:117] "RemoveContainer" containerID="687251f59a32c12d11eb156d5811c1201ccc61fb3cded4b501886f54720185b3"
Dec 16 08:27:31 crc kubenswrapper[4789]: I1216 08:27:31.962755 4789 scope.go:117] "RemoveContainer" containerID="36444dd3b3eb4e4d30df9516b4f9a42484f4c22695eaa6d5753167be5037a3cc"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.013129 4789 scope.go:117] "RemoveContainer" containerID="67a695f9157b0862a4d43a2b8b4fcba65384bcf7358897cbe3bd13363b23ef6e"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.028098 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psprx\" (UniqueName: \"kubernetes.io/projected/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-kube-api-access-psprx\") pod \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") "
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.028216 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-combined-ca-bundle\") pod \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") "
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.028337 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-job-config-data\") pod \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") "
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.028366 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-config-data\") pod \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\" (UID: \"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a\") "
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.034146 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" (UID: "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.034212 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-kube-api-access-psprx" (OuterVolumeSpecName: "kube-api-access-psprx") pod "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" (UID: "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a"). InnerVolumeSpecName "kube-api-access-psprx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.039165 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-config-data" (OuterVolumeSpecName: "config-data") pod "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" (UID: "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.062401 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" (UID: "1a044ea9-86b0-4a61-b6b9-55f3663f1d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.130796 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psprx\" (UniqueName: \"kubernetes.io/projected/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-kube-api-access-psprx\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.130821 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.130832 4789 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-job-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.130841 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.347178 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xf5wn" event={"ID":"1a044ea9-86b0-4a61-b6b9-55f3663f1d3a","Type":"ContainerDied","Data":"b288ce928c333457df756c562bd82f118a76480be446f946eaf0358d42ffd020"}
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.347216 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-xf5wn"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.347228 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b288ce928c333457df756c562bd82f118a76480be446f946eaf0358d42ffd020"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.736584 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Dec 16 08:27:32 crc kubenswrapper[4789]: E1216 08:27:32.737411 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" containerName="manila-db-sync"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.737434 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" containerName="manila-db-sync"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.737672 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" containerName="manila-db-sync"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.738931 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.749525 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.749651 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.749529 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.749878 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-gl4rx"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.772899 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.799077 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.801151 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.807014 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.817858 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.871221 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f770399-9fac-4410-b6be-ecb2830512c5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.871344 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-scripts\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.871621 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8rw5\" (UniqueName: \"kubernetes.io/projected/7f770399-9fac-4410-b6be-ecb2830512c5-kube-api-access-c8rw5\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.871696 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.871739 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-config-data\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.871871 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.887990 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895d86687-pbz9f"]
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.891318 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895d86687-pbz9f"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.911368 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895d86687-pbz9f"]
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975585 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975628 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f770399-9fac-4410-b6be-ecb2830512c5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975659 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-scripts\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975681 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-scripts\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975734 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-config-data\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975755 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a00d6fa-713d-4297-b5cc-7ca06b736d65-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975807 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnm5v\" (UniqueName: \"kubernetes.io/projected/7a00d6fa-713d-4297-b5cc-7ca06b736d65-kube-api-access-jnm5v\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975837 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8rw5\" (UniqueName: \"kubernetes.io/projected/7f770399-9fac-4410-b6be-ecb2830512c5-kube-api-access-c8rw5\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975872 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975894 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-config-data\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975933 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/7a00d6fa-713d-4297-b5cc-7ca06b736d65-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975949 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0"
Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975977 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.975994 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a00d6fa-713d-4297-b5cc-7ca06b736d65-ceph\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.977055 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f770399-9fac-4410-b6be-ecb2830512c5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0" Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.987686 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-scripts\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0" Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.994722 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0" Dec 16 08:27:32 crc kubenswrapper[4789]: I1216 08:27:32.998162 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-config-data\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " 
pod="openstack/manila-scheduler-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.001049 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f770399-9fac-4410-b6be-ecb2830512c5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.027589 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8rw5\" (UniqueName: \"kubernetes.io/projected/7f770399-9fac-4410-b6be-ecb2830512c5-kube-api-access-c8rw5\") pod \"manila-scheduler-0\" (UID: \"7f770399-9fac-4410-b6be-ecb2830512c5\") " pod="openstack/manila-scheduler-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080004 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a00d6fa-713d-4297-b5cc-7ca06b736d65-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080095 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnm5v\" (UniqueName: \"kubernetes.io/projected/7a00d6fa-713d-4297-b5cc-7ca06b736d65-kube-api-access-jnm5v\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080120 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-config\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080155 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-dns-svc\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080198 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/7a00d6fa-713d-4297-b5cc-7ca06b736d65-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080218 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080254 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080278 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a00d6fa-713d-4297-b5cc-7ca06b736d65-ceph\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080329 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-sb\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080358 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdnd8\" (UniqueName: \"kubernetes.io/projected/647d85e5-d8f3-466d-b348-52bfd9196717-kube-api-access-jdnd8\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080385 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-nb\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080411 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-scripts\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.080470 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-config-data\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.084465 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7a00d6fa-713d-4297-b5cc-7ca06b736d65-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.085084 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/7a00d6fa-713d-4297-b5cc-7ca06b736d65-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.089027 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-config-data\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.095312 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.096962 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-scripts\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.118387 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a00d6fa-713d-4297-b5cc-7ca06b736d65-ceph\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 
08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.118949 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a00d6fa-713d-4297-b5cc-7ca06b736d65-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.127479 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.129533 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnm5v\" (UniqueName: \"kubernetes.io/projected/7a00d6fa-713d-4297-b5cc-7ca06b736d65-kube-api-access-jnm5v\") pod \"manila-share-share1-0\" (UID: \"7a00d6fa-713d-4297-b5cc-7ca06b736d65\") " pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.157824 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.183132 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-sb\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.183221 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdnd8\" (UniqueName: \"kubernetes.io/projected/647d85e5-d8f3-466d-b348-52bfd9196717-kube-api-access-jdnd8\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.183251 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-nb\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.183425 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-config\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.183463 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-dns-svc\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 
crc kubenswrapper[4789]: I1216 08:27:33.184654 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-dns-svc\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.185194 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-sb\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.185405 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-nb\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.185786 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-config\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.209797 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.216972 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.222674 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdnd8\" (UniqueName: \"kubernetes.io/projected/647d85e5-d8f3-466d-b348-52bfd9196717-kube-api-access-jdnd8\") pod \"dnsmasq-dns-895d86687-pbz9f\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") " pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.224154 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.230718 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.240259 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.387652 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwrc\" (UniqueName: \"kubernetes.io/projected/c7fe2a8b-685e-46d1-8890-7f4a0d752135-kube-api-access-ncwrc\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.387738 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fe2a8b-685e-46d1-8890-7f4a0d752135-logs\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.387797 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-config-data-custom\") pod \"manila-api-0\" (UID: 
\"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.387968 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7fe2a8b-685e-46d1-8890-7f4a0d752135-etc-machine-id\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.388003 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.388115 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-scripts\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.388369 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-config-data\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.490021 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-config-data\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.490125 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ncwrc\" (UniqueName: \"kubernetes.io/projected/c7fe2a8b-685e-46d1-8890-7f4a0d752135-kube-api-access-ncwrc\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.490157 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fe2a8b-685e-46d1-8890-7f4a0d752135-logs\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.490173 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-config-data-custom\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.490228 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7fe2a8b-685e-46d1-8890-7f4a0d752135-etc-machine-id\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.490249 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.490274 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-scripts\") pod \"manila-api-0\" (UID: 
\"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.494954 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7fe2a8b-685e-46d1-8890-7f4a0d752135-etc-machine-id\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.497988 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fe2a8b-685e-46d1-8890-7f4a0d752135-logs\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.501275 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.511479 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-scripts\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.516493 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-config-data\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.516997 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c7fe2a8b-685e-46d1-8890-7f4a0d752135-config-data-custom\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.531381 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwrc\" (UniqueName: \"kubernetes.io/projected/c7fe2a8b-685e-46d1-8890-7f4a0d752135-kube-api-access-ncwrc\") pod \"manila-api-0\" (UID: \"c7fe2a8b-685e-46d1-8890-7f4a0d752135\") " pod="openstack/manila-api-0" Dec 16 08:27:33 crc kubenswrapper[4789]: I1216 08:27:33.553543 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 16 08:27:34 crc kubenswrapper[4789]: I1216 08:27:34.228343 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 16 08:27:34 crc kubenswrapper[4789]: W1216 08:27:34.246639 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod647d85e5_d8f3_466d_b348_52bfd9196717.slice/crio-482f93eabc17088c616f0385ea1304bde2aeef8b8ceea3251e0c5b43f8ce1c68 WatchSource:0}: Error finding container 482f93eabc17088c616f0385ea1304bde2aeef8b8ceea3251e0c5b43f8ce1c68: Status 404 returned error can't find the container with id 482f93eabc17088c616f0385ea1304bde2aeef8b8ceea3251e0c5b43f8ce1c68 Dec 16 08:27:34 crc kubenswrapper[4789]: I1216 08:27:34.255688 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895d86687-pbz9f"] Dec 16 08:27:34 crc kubenswrapper[4789]: I1216 08:27:34.348432 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 16 08:27:34 crc kubenswrapper[4789]: W1216 08:27:34.355153 4789 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a00d6fa_713d_4297_b5cc_7ca06b736d65.slice/crio-6841cef86dfb1a17efdcad923d3d64c13665b6458afa48012ddfb081bc4e8d41 WatchSource:0}: Error finding container 6841cef86dfb1a17efdcad923d3d64c13665b6458afa48012ddfb081bc4e8d41: Status 404 returned error can't find the container with id 6841cef86dfb1a17efdcad923d3d64c13665b6458afa48012ddfb081bc4e8d41 Dec 16 08:27:34 crc kubenswrapper[4789]: I1216 08:27:34.369708 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7f770399-9fac-4410-b6be-ecb2830512c5","Type":"ContainerStarted","Data":"338b2c87de9849ddaa326364d0c6899e8959b873cb4a93d02be201a7e3798095"} Dec 16 08:27:34 crc kubenswrapper[4789]: I1216 08:27:34.372048 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895d86687-pbz9f" event={"ID":"647d85e5-d8f3-466d-b348-52bfd9196717","Type":"ContainerStarted","Data":"482f93eabc17088c616f0385ea1304bde2aeef8b8ceea3251e0c5b43f8ce1c68"} Dec 16 08:27:34 crc kubenswrapper[4789]: I1216 08:27:34.373631 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"7a00d6fa-713d-4297-b5cc-7ca06b736d65","Type":"ContainerStarted","Data":"6841cef86dfb1a17efdcad923d3d64c13665b6458afa48012ddfb081bc4e8d41"} Dec 16 08:27:34 crc kubenswrapper[4789]: I1216 08:27:34.414212 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 16 08:27:34 crc kubenswrapper[4789]: W1216 08:27:34.421724 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7fe2a8b_685e_46d1_8890_7f4a0d752135.slice/crio-7faedb8e83086ebfb62b972c774b07edaa13f265030d6ac9697ade698df28278 WatchSource:0}: Error finding container 7faedb8e83086ebfb62b972c774b07edaa13f265030d6ac9697ade698df28278: Status 404 returned error can't find the container with id 
7faedb8e83086ebfb62b972c774b07edaa13f265030d6ac9697ade698df28278 Dec 16 08:27:35 crc kubenswrapper[4789]: I1216 08:27:35.395074 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7f770399-9fac-4410-b6be-ecb2830512c5","Type":"ContainerStarted","Data":"0d48c7c3fb92ef09b8ba5101afb37ce8cf2e217d85cc29ac207a4b2c0ed6e6f9"} Dec 16 08:27:35 crc kubenswrapper[4789]: I1216 08:27:35.411943 4789 generic.go:334] "Generic (PLEG): container finished" podID="647d85e5-d8f3-466d-b348-52bfd9196717" containerID="aad89a793090bb68fb5a524c94c6a462c553a6246c44de285ab55752a02f437c" exitCode=0 Dec 16 08:27:35 crc kubenswrapper[4789]: I1216 08:27:35.412058 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895d86687-pbz9f" event={"ID":"647d85e5-d8f3-466d-b348-52bfd9196717","Type":"ContainerDied","Data":"aad89a793090bb68fb5a524c94c6a462c553a6246c44de285ab55752a02f437c"} Dec 16 08:27:35 crc kubenswrapper[4789]: I1216 08:27:35.422441 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c7fe2a8b-685e-46d1-8890-7f4a0d752135","Type":"ContainerStarted","Data":"7c0cb241ed3a3f46849d14fdf2703960eba985f51d7032dc3e35fc99fc4c72f3"} Dec 16 08:27:35 crc kubenswrapper[4789]: I1216 08:27:35.422484 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c7fe2a8b-685e-46d1-8890-7f4a0d752135","Type":"ContainerStarted","Data":"7faedb8e83086ebfb62b972c774b07edaa13f265030d6ac9697ade698df28278"} Dec 16 08:27:36 crc kubenswrapper[4789]: I1216 08:27:36.441717 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c7fe2a8b-685e-46d1-8890-7f4a0d752135","Type":"ContainerStarted","Data":"26739cb978985e975a9428c224670d2891ca0f8d1ebd8a16b18aaa9b25455c4d"} Dec 16 08:27:36 crc kubenswrapper[4789]: I1216 08:27:36.442365 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 
16 08:27:36 crc kubenswrapper[4789]: I1216 08:27:36.448746 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7f770399-9fac-4410-b6be-ecb2830512c5","Type":"ContainerStarted","Data":"49eddec3882afb31be11a2f3300438313e71c669de9d5fd62544408aaa40de92"} Dec 16 08:27:36 crc kubenswrapper[4789]: I1216 08:27:36.452504 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895d86687-pbz9f" event={"ID":"647d85e5-d8f3-466d-b348-52bfd9196717","Type":"ContainerStarted","Data":"84eb2cf8cb71f818502773af72b5110db1d8259f3833f0b67d88fd3d42c4741f"} Dec 16 08:27:36 crc kubenswrapper[4789]: I1216 08:27:36.453339 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:36 crc kubenswrapper[4789]: I1216 08:27:36.474998 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.474979807 podStartE2EDuration="3.474979807s" podCreationTimestamp="2025-12-16 08:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:27:36.468307524 +0000 UTC m=+5794.730195153" watchObservedRunningTime="2025-12-16 08:27:36.474979807 +0000 UTC m=+5794.736867446" Dec 16 08:27:36 crc kubenswrapper[4789]: I1216 08:27:36.493328 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.121260341 podStartE2EDuration="4.493311985s" podCreationTimestamp="2025-12-16 08:27:32 +0000 UTC" firstStartedPulling="2025-12-16 08:27:34.224041052 +0000 UTC m=+5792.485928691" lastFinishedPulling="2025-12-16 08:27:34.596092706 +0000 UTC m=+5792.857980335" observedRunningTime="2025-12-16 08:27:36.491156952 +0000 UTC m=+5794.753044601" watchObservedRunningTime="2025-12-16 08:27:36.493311985 +0000 UTC m=+5794.755199614" Dec 16 08:27:36 crc 
kubenswrapper[4789]: I1216 08:27:36.512864 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895d86687-pbz9f" podStartSLOduration=4.512845072 podStartE2EDuration="4.512845072s" podCreationTimestamp="2025-12-16 08:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:27:36.509941081 +0000 UTC m=+5794.771828710" watchObservedRunningTime="2025-12-16 08:27:36.512845072 +0000 UTC m=+5794.774732701" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.186379 4789 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb19174f6-911d-4d14-a9b1-c7e6c555f4ac"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb19174f6-911d-4d14-a9b1-c7e6c555f4ac] : Timed out while waiting for systemd to remove kubepods-besteffort-podb19174f6_911d_4d14_a9b1_c7e6c555f4ac.slice" Dec 16 08:27:38 crc kubenswrapper[4789]: E1216 08:27:38.186665 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podb19174f6-911d-4d14-a9b1-c7e6c555f4ac] : unable to destroy cgroup paths for cgroup [kubepods besteffort podb19174f6-911d-4d14-a9b1-c7e6c555f4ac] : Timed out while waiting for systemd to remove kubepods-besteffort-podb19174f6_911d_4d14_a9b1_c7e6c555f4ac.slice" pod="openstack/ceilometer-0" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.470812 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.520747 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.532018 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.542927 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.545439 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.549626 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.550078 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.555487 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.635188 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.635276 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvmbn\" (UniqueName: \"kubernetes.io/projected/8dc4e338-0a4d-48d5-a066-837336d06d5f-kube-api-access-vvmbn\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.635375 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.635400 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-run-httpd\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.635417 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-log-httpd\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.635433 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-config-data\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.635463 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-scripts\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.737264 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-config-data\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.737588 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-scripts\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.737788 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.737950 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvmbn\" (UniqueName: \"kubernetes.io/projected/8dc4e338-0a4d-48d5-a066-837336d06d5f-kube-api-access-vvmbn\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.738153 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.738254 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-run-httpd\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: 
I1216 08:27:38.738351 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-log-httpd\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.738857 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-log-httpd\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.738996 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-run-httpd\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.744385 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-scripts\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.744448 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.745225 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " 
pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.745294 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-config-data\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.759686 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvmbn\" (UniqueName: \"kubernetes.io/projected/8dc4e338-0a4d-48d5-a066-837336d06d5f-kube-api-access-vvmbn\") pod \"ceilometer-0\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " pod="openstack/ceilometer-0" Dec 16 08:27:38 crc kubenswrapper[4789]: I1216 08:27:38.863424 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:27:39 crc kubenswrapper[4789]: I1216 08:27:39.402223 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:39 crc kubenswrapper[4789]: I1216 08:27:39.483488 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerStarted","Data":"cd743261a362e66f59c6df6359041dce6cc68c7d2f1d07ee869539c6ffceb632"} Dec 16 08:27:40 crc kubenswrapper[4789]: I1216 08:27:40.127898 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19174f6-911d-4d14-a9b1-c7e6c555f4ac" path="/var/lib/kubelet/pods/b19174f6-911d-4d14-a9b1-c7e6c555f4ac/volumes" Dec 16 08:27:40 crc kubenswrapper[4789]: I1216 08:27:40.496888 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerStarted","Data":"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8"} Dec 16 08:27:42 crc kubenswrapper[4789]: I1216 08:27:42.515720 4789 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/manila-share-share1-0" event={"ID":"7a00d6fa-713d-4297-b5cc-7ca06b736d65","Type":"ContainerStarted","Data":"8840c386d2c8ac49f68ef36bd43ce01af2bf475f8913456a808e1ec11e30ddb8"} Dec 16 08:27:42 crc kubenswrapper[4789]: I1216 08:27:42.520442 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerStarted","Data":"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45"} Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.050053 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wh2l7"] Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.071456 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wh2l7"] Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.081163 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9r2s9"] Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.089993 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9r2s9"] Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.128990 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.242716 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.331002 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f8595455-rc2h8"] Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.331322 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" podUID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerName="dnsmasq-dns" 
containerID="cri-o://6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1" gracePeriod=10 Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.532602 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerStarted","Data":"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf"} Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.535058 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"7a00d6fa-713d-4297-b5cc-7ca06b736d65","Type":"ContainerStarted","Data":"0969f70595c2d751ce72822883d31823f553d8ceeb987b2280ffb2d0ebe816ef"} Dec 16 08:27:43 crc kubenswrapper[4789]: I1216 08:27:43.570210 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.16293917 podStartE2EDuration="11.570191192s" podCreationTimestamp="2025-12-16 08:27:32 +0000 UTC" firstStartedPulling="2025-12-16 08:27:34.357214978 +0000 UTC m=+5792.619102617" lastFinishedPulling="2025-12-16 08:27:41.764467 +0000 UTC m=+5800.026354639" observedRunningTime="2025-12-16 08:27:43.559453079 +0000 UTC m=+5801.821340728" watchObservedRunningTime="2025-12-16 08:27:43.570191192 +0000 UTC m=+5801.832078821" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.029967 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2aee-account-create-update-kzmtw"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.041749 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d3bb-account-create-update-pmmb7"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.053517 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3c4f-account-create-update-kqkc6"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.089333 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-2aee-account-create-update-kzmtw"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.121432 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ce2586-6062-48bc-a867-37d682d9b3b1" path="/var/lib/kubelet/pods/07ce2586-6062-48bc-a867-37d682d9b3b1/volumes" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.122047 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d072154-82fc-4258-943f-1900efa7273c" path="/var/lib/kubelet/pods/7d072154-82fc-4258-943f-1900efa7273c/volumes" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.122686 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83111890-5086-4173-a090-084f8d14334e" path="/var/lib/kubelet/pods/83111890-5086-4173-a090-084f8d14334e/volumes" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.123404 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-fxgvr"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.124852 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d3bb-account-create-update-pmmb7"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.142617 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3c4f-account-create-update-kqkc6"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.154372 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fxgvr"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.493483 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.561284 4789 generic.go:334] "Generic (PLEG): container finished" podID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerID="6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1" exitCode=0 Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.562587 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.563268 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" event={"ID":"4094f3ce-80f9-4542-ab32-9fb80b03b11b","Type":"ContainerDied","Data":"6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1"} Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.563306 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86f8595455-rc2h8" event={"ID":"4094f3ce-80f9-4542-ab32-9fb80b03b11b","Type":"ContainerDied","Data":"a248f107961e40338c09a322a826d38dd1d7589d5de160420f53ece996bb19d8"} Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.563328 4789 scope.go:117] "RemoveContainer" containerID="6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.598828 4789 scope.go:117] "RemoveContainer" containerID="b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.624705 4789 scope.go:117] "RemoveContainer" containerID="6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1" Dec 16 08:27:44 crc kubenswrapper[4789]: E1216 08:27:44.629372 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1\": container with ID starting with 
6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1 not found: ID does not exist" containerID="6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.629415 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1"} err="failed to get container status \"6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1\": rpc error: code = NotFound desc = could not find container \"6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1\": container with ID starting with 6e9fd4d74035358911d05113bbbdb236d66f983d9df81e4be5ff6671f76bc7c1 not found: ID does not exist" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.629440 4789 scope.go:117] "RemoveContainer" containerID="b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc" Dec 16 08:27:44 crc kubenswrapper[4789]: E1216 08:27:44.629959 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc\": container with ID starting with b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc not found: ID does not exist" containerID="b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.629987 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc"} err="failed to get container status \"b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc\": rpc error: code = NotFound desc = could not find container \"b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc\": container with ID starting with b0cb09a2ce013cba4db3984781ecf858831283eec03e66c43188c0ee96bd5bbc not found: ID does not 
exist" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.671317 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-sb\") pod \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.671448 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlfk8\" (UniqueName: \"kubernetes.io/projected/4094f3ce-80f9-4542-ab32-9fb80b03b11b-kube-api-access-hlfk8\") pod \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.671477 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-nb\") pod \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.671513 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-config\") pod \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.671552 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-dns-svc\") pod \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\" (UID: \"4094f3ce-80f9-4542-ab32-9fb80b03b11b\") " Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.678401 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4094f3ce-80f9-4542-ab32-9fb80b03b11b-kube-api-access-hlfk8" 
(OuterVolumeSpecName: "kube-api-access-hlfk8") pod "4094f3ce-80f9-4542-ab32-9fb80b03b11b" (UID: "4094f3ce-80f9-4542-ab32-9fb80b03b11b"). InnerVolumeSpecName "kube-api-access-hlfk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.731837 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4094f3ce-80f9-4542-ab32-9fb80b03b11b" (UID: "4094f3ce-80f9-4542-ab32-9fb80b03b11b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.735280 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4094f3ce-80f9-4542-ab32-9fb80b03b11b" (UID: "4094f3ce-80f9-4542-ab32-9fb80b03b11b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.736797 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-config" (OuterVolumeSpecName: "config") pod "4094f3ce-80f9-4542-ab32-9fb80b03b11b" (UID: "4094f3ce-80f9-4542-ab32-9fb80b03b11b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.751113 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4094f3ce-80f9-4542-ab32-9fb80b03b11b" (UID: "4094f3ce-80f9-4542-ab32-9fb80b03b11b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.777942 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.778025 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlfk8\" (UniqueName: \"kubernetes.io/projected/4094f3ce-80f9-4542-ab32-9fb80b03b11b-kube-api-access-hlfk8\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.778040 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.778049 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.778059 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4094f3ce-80f9-4542-ab32-9fb80b03b11b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.907320 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86f8595455-rc2h8"] Dec 16 08:27:44 crc kubenswrapper[4789]: I1216 08:27:44.918813 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86f8595455-rc2h8"] Dec 16 08:27:45 crc kubenswrapper[4789]: I1216 08:27:45.579172 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerStarted","Data":"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9"} Dec 16 08:27:45 crc kubenswrapper[4789]: I1216 08:27:45.580677 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 08:27:45 crc kubenswrapper[4789]: I1216 08:27:45.608854 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.802164157 podStartE2EDuration="7.608835978s" podCreationTimestamp="2025-12-16 08:27:38 +0000 UTC" firstStartedPulling="2025-12-16 08:27:39.409140214 +0000 UTC m=+5797.671027843" lastFinishedPulling="2025-12-16 08:27:44.215812035 +0000 UTC m=+5802.477699664" observedRunningTime="2025-12-16 08:27:45.599394697 +0000 UTC m=+5803.861282326" watchObservedRunningTime="2025-12-16 08:27:45.608835978 +0000 UTC m=+5803.870723607" Dec 16 08:27:46 crc kubenswrapper[4789]: I1216 08:27:46.083772 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:46 crc kubenswrapper[4789]: I1216 08:27:46.117213 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0" path="/var/lib/kubelet/pods/2bdeb5a0-5c0b-48f1-b444-0b0eda2d1bb0/volumes" Dec 16 08:27:46 crc kubenswrapper[4789]: I1216 08:27:46.117909 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" path="/var/lib/kubelet/pods/4094f3ce-80f9-4542-ab32-9fb80b03b11b/volumes" Dec 16 08:27:46 crc kubenswrapper[4789]: I1216 08:27:46.118528 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805906ee-0f3c-48a2-bbe5-c294c6299888" path="/var/lib/kubelet/pods/805906ee-0f3c-48a2-bbe5-c294c6299888/volumes" Dec 16 08:27:46 crc kubenswrapper[4789]: I1216 08:27:46.119625 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a474cd62-d32b-4059-bacc-f878b03ffbfb" 
path="/var/lib/kubelet/pods/a474cd62-d32b-4059-bacc-f878b03ffbfb/volumes" Dec 16 08:27:47 crc kubenswrapper[4789]: I1216 08:27:47.599134 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-central-agent" containerID="cri-o://d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" gracePeriod=30 Dec 16 08:27:47 crc kubenswrapper[4789]: I1216 08:27:47.599206 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-notification-agent" containerID="cri-o://21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" gracePeriod=30 Dec 16 08:27:47 crc kubenswrapper[4789]: I1216 08:27:47.599208 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="proxy-httpd" containerID="cri-o://8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" gracePeriod=30 Dec 16 08:27:47 crc kubenswrapper[4789]: I1216 08:27:47.599181 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="sg-core" containerID="cri-o://b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" gracePeriod=30 Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.463374 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612286 4789 generic.go:334] "Generic (PLEG): container finished" podID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerID="8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" exitCode=0 Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612320 4789 generic.go:334] "Generic (PLEG): container finished" podID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerID="b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" exitCode=2 Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612329 4789 generic.go:334] "Generic (PLEG): container finished" podID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerID="21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" exitCode=0 Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612336 4789 generic.go:334] "Generic (PLEG): container finished" podID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerID="d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" exitCode=0 Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612357 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerDied","Data":"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9"} Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612372 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612383 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerDied","Data":"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf"} Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612395 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerDied","Data":"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45"} Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612403 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerDied","Data":"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8"} Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612411 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8dc4e338-0a4d-48d5-a066-837336d06d5f","Type":"ContainerDied","Data":"cd743261a362e66f59c6df6359041dce6cc68c7d2f1d07ee869539c6ffceb632"} Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.612427 4789 scope.go:117] "RemoveContainer" containerID="8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.642906 4789 scope.go:117] "RemoveContainer" containerID="b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.652591 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-sg-core-conf-yaml\") pod \"8dc4e338-0a4d-48d5-a066-837336d06d5f\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 
08:27:48.652687 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvmbn\" (UniqueName: \"kubernetes.io/projected/8dc4e338-0a4d-48d5-a066-837336d06d5f-kube-api-access-vvmbn\") pod \"8dc4e338-0a4d-48d5-a066-837336d06d5f\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.652797 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-log-httpd\") pod \"8dc4e338-0a4d-48d5-a066-837336d06d5f\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.652939 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-run-httpd\") pod \"8dc4e338-0a4d-48d5-a066-837336d06d5f\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.652978 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-config-data\") pod \"8dc4e338-0a4d-48d5-a066-837336d06d5f\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.653037 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-scripts\") pod \"8dc4e338-0a4d-48d5-a066-837336d06d5f\" (UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.653061 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-combined-ca-bundle\") pod \"8dc4e338-0a4d-48d5-a066-837336d06d5f\" 
(UID: \"8dc4e338-0a4d-48d5-a066-837336d06d5f\") " Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.653843 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8dc4e338-0a4d-48d5-a066-837336d06d5f" (UID: "8dc4e338-0a4d-48d5-a066-837336d06d5f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.656479 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8dc4e338-0a4d-48d5-a066-837336d06d5f" (UID: "8dc4e338-0a4d-48d5-a066-837336d06d5f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.658932 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc4e338-0a4d-48d5-a066-837336d06d5f-kube-api-access-vvmbn" (OuterVolumeSpecName: "kube-api-access-vvmbn") pod "8dc4e338-0a4d-48d5-a066-837336d06d5f" (UID: "8dc4e338-0a4d-48d5-a066-837336d06d5f"). InnerVolumeSpecName "kube-api-access-vvmbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.659102 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-scripts" (OuterVolumeSpecName: "scripts") pod "8dc4e338-0a4d-48d5-a066-837336d06d5f" (UID: "8dc4e338-0a4d-48d5-a066-837336d06d5f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.669508 4789 scope.go:117] "RemoveContainer" containerID="21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.687704 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8dc4e338-0a4d-48d5-a066-837336d06d5f" (UID: "8dc4e338-0a4d-48d5-a066-837336d06d5f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.695285 4789 scope.go:117] "RemoveContainer" containerID="d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.720111 4789 scope.go:117] "RemoveContainer" containerID="8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" Dec 16 08:27:48 crc kubenswrapper[4789]: E1216 08:27:48.721111 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": container with ID starting with 8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9 not found: ID does not exist" containerID="8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.721154 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9"} err="failed to get container status \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": rpc error: code = NotFound desc = could not find container \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": container with ID starting with 
8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.721183 4789 scope.go:117] "RemoveContainer" containerID="b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" Dec 16 08:27:48 crc kubenswrapper[4789]: E1216 08:27:48.721757 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": container with ID starting with b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf not found: ID does not exist" containerID="b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.721797 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf"} err="failed to get container status \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": rpc error: code = NotFound desc = could not find container \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": container with ID starting with b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.721826 4789 scope.go:117] "RemoveContainer" containerID="21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" Dec 16 08:27:48 crc kubenswrapper[4789]: E1216 08:27:48.727763 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": container with ID starting with 21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45 not found: ID does not exist" containerID="21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" Dec 16 08:27:48 crc 
kubenswrapper[4789]: I1216 08:27:48.727794 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45"} err="failed to get container status \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": rpc error: code = NotFound desc = could not find container \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": container with ID starting with 21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.727814 4789 scope.go:117] "RemoveContainer" containerID="d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" Dec 16 08:27:48 crc kubenswrapper[4789]: E1216 08:27:48.728064 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": container with ID starting with d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8 not found: ID does not exist" containerID="d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.728085 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8"} err="failed to get container status \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": rpc error: code = NotFound desc = could not find container \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": container with ID starting with d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.728102 4789 scope.go:117] "RemoveContainer" containerID="8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" Dec 16 
08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.728402 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9"} err="failed to get container status \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": rpc error: code = NotFound desc = could not find container \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": container with ID starting with 8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.728422 4789 scope.go:117] "RemoveContainer" containerID="b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.728800 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf"} err="failed to get container status \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": rpc error: code = NotFound desc = could not find container \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": container with ID starting with b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.728905 4789 scope.go:117] "RemoveContainer" containerID="21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.729279 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45"} err="failed to get container status \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": rpc error: code = NotFound desc = could not find container 
\"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": container with ID starting with 21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.729388 4789 scope.go:117] "RemoveContainer" containerID="d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.729774 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8"} err="failed to get container status \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": rpc error: code = NotFound desc = could not find container \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": container with ID starting with d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.729844 4789 scope.go:117] "RemoveContainer" containerID="8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730135 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9"} err="failed to get container status \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": rpc error: code = NotFound desc = could not find container \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": container with ID starting with 8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730156 4789 scope.go:117] "RemoveContainer" containerID="b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730356 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf"} err="failed to get container status \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": rpc error: code = NotFound desc = could not find container \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": container with ID starting with b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730372 4789 scope.go:117] "RemoveContainer" containerID="21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730521 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45"} err="failed to get container status \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": rpc error: code = NotFound desc = could not find container \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": container with ID starting with 21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730535 4789 scope.go:117] "RemoveContainer" containerID="d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730704 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8"} err="failed to get container status \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": rpc error: code = NotFound desc = could not find container \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": container with ID starting with 
d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.730826 4789 scope.go:117] "RemoveContainer" containerID="8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.731165 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9"} err="failed to get container status \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": rpc error: code = NotFound desc = could not find container \"8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9\": container with ID starting with 8adf78603039d28668ab7be394734f0219a526b09cf3b0f47883c54db57a39f9 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.731185 4789 scope.go:117] "RemoveContainer" containerID="b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.731336 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf"} err="failed to get container status \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": rpc error: code = NotFound desc = could not find container \"b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf\": container with ID starting with b9a362ae3fc050f96726fff56735c8f939377bc7a34bd1cfc1f970523b4e3bbf not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.731353 4789 scope.go:117] "RemoveContainer" containerID="21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.731501 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45"} err="failed to get container status \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": rpc error: code = NotFound desc = could not find container \"21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45\": container with ID starting with 21d46e5f4d4266d0c1119d653c167f13aaa31753c1319e4afa2cae27f83eeb45 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.731520 4789 scope.go:117] "RemoveContainer" containerID="d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.731703 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8"} err="failed to get container status \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": rpc error: code = NotFound desc = could not find container \"d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8\": container with ID starting with d73090ddebc37266ee784b2a8c614d0a96cb404aeb54ceb7d615c110911efde8 not found: ID does not exist" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.733539 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dc4e338-0a4d-48d5-a066-837336d06d5f" (UID: "8dc4e338-0a4d-48d5-a066-837336d06d5f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.755636 4789 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.755698 4789 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.755713 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.755727 4789 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.755739 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvmbn\" (UniqueName: \"kubernetes.io/projected/8dc4e338-0a4d-48d5-a066-837336d06d5f-kube-api-access-vvmbn\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.755751 4789 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8dc4e338-0a4d-48d5-a066-837336d06d5f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.757643 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-config-data" (OuterVolumeSpecName: "config-data") pod "8dc4e338-0a4d-48d5-a066-837336d06d5f" (UID: "8dc4e338-0a4d-48d5-a066-837336d06d5f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.857677 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dc4e338-0a4d-48d5-a066-837336d06d5f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.962496 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:48 crc kubenswrapper[4789]: I1216 08:27:48.998902 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.014362 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 08:27:49 crc kubenswrapper[4789]: E1216 08:27:49.014790 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-notification-agent" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.014807 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-notification-agent" Dec 16 08:27:49 crc kubenswrapper[4789]: E1216 08:27:49.014820 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerName="dnsmasq-dns" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.014826 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerName="dnsmasq-dns" Dec 16 08:27:49 crc kubenswrapper[4789]: E1216 08:27:49.014840 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="sg-core" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.014846 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="sg-core" Dec 16 08:27:49 
crc kubenswrapper[4789]: E1216 08:27:49.014860 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-central-agent" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.014865 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-central-agent" Dec 16 08:27:49 crc kubenswrapper[4789]: E1216 08:27:49.014879 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerName="init" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.014885 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerName="init" Dec 16 08:27:49 crc kubenswrapper[4789]: E1216 08:27:49.014893 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="proxy-httpd" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.014899 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="proxy-httpd" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.015090 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="proxy-httpd" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.015104 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4094f3ce-80f9-4542-ab32-9fb80b03b11b" containerName="dnsmasq-dns" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.015118 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-central-agent" Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.015133 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="sg-core" Dec 16 08:27:49 crc 
kubenswrapper[4789]: I1216 08:27:49.015141 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" containerName="ceilometer-notification-agent"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.016965 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.019188 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.019778 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.029897 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.168493 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpm48\" (UniqueName: \"kubernetes.io/projected/0f488b83-1fc0-40a5-be04-50e5267d4792-kube-api-access-bpm48\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.168575 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-scripts\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.168614 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f488b83-1fc0-40a5-be04-50e5267d4792-log-httpd\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.168774 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f488b83-1fc0-40a5-be04-50e5267d4792-run-httpd\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.168977 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.169026 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-config-data\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.169262 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.272019 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpm48\" (UniqueName: \"kubernetes.io/projected/0f488b83-1fc0-40a5-be04-50e5267d4792-kube-api-access-bpm48\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.272111 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-scripts\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.272158 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f488b83-1fc0-40a5-be04-50e5267d4792-log-httpd\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.272190 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f488b83-1fc0-40a5-be04-50e5267d4792-run-httpd\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.272244 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.272266 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-config-data\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.272375 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.273114 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f488b83-1fc0-40a5-be04-50e5267d4792-run-httpd\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.273723 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f488b83-1fc0-40a5-be04-50e5267d4792-log-httpd\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.278481 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.282740 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-scripts\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.283033 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-config-data\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.290987 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f488b83-1fc0-40a5-be04-50e5267d4792-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.293573 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpm48\" (UniqueName: \"kubernetes.io/projected/0f488b83-1fc0-40a5-be04-50e5267d4792-kube-api-access-bpm48\") pod \"ceilometer-0\" (UID: \"0f488b83-1fc0-40a5-be04-50e5267d4792\") " pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.340745 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 08:27:49 crc kubenswrapper[4789]: I1216 08:27:49.780900 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 08:27:50 crc kubenswrapper[4789]: I1216 08:27:50.117448 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc4e338-0a4d-48d5-a066-837336d06d5f" path="/var/lib/kubelet/pods/8dc4e338-0a4d-48d5-a066-837336d06d5f/volumes"
Dec 16 08:27:50 crc kubenswrapper[4789]: I1216 08:27:50.634057 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f488b83-1fc0-40a5-be04-50e5267d4792","Type":"ContainerStarted","Data":"b6c00605e0739d4ca3cc8a8a3f09419c3ebfc726405ae3bfe1b32575f7965fe3"}
Dec 16 08:27:50 crc kubenswrapper[4789]: I1216 08:27:50.634098 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f488b83-1fc0-40a5-be04-50e5267d4792","Type":"ContainerStarted","Data":"f7d1501d46f9d2242463cc980c258c04869d6bd3a810039d9cc6439a9d5f6cb6"}
Dec 16 08:27:50 crc kubenswrapper[4789]: I1216 08:27:50.634112 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f488b83-1fc0-40a5-be04-50e5267d4792","Type":"ContainerStarted","Data":"512bbe3d8acc7dc7f3664e43ffee9bcf77fb65dd7f2b0a60254955ecae91cd14"}
Dec 16 08:27:51 crc kubenswrapper[4789]: I1216 08:27:51.645129 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f488b83-1fc0-40a5-be04-50e5267d4792","Type":"ContainerStarted","Data":"18c235560c54bc93bcb29307ba165ec34354cdd51d52db82aae6c3db79f91534"}
Dec 16 08:27:51 crc kubenswrapper[4789]: I1216 08:27:51.927459 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 08:27:51 crc kubenswrapper[4789]: I1216 08:27:51.927538 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 08:27:51 crc kubenswrapper[4789]: I1216 08:27:51.927599 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87"
Dec 16 08:27:51 crc kubenswrapper[4789]: I1216 08:27:51.928448 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93694727a36708bd1006bf75a00f6ac1c8b551c001d91ed3b60ce8e5c8ebae39"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 08:27:51 crc kubenswrapper[4789]: I1216 08:27:51.928503 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://93694727a36708bd1006bf75a00f6ac1c8b551c001d91ed3b60ce8e5c8ebae39" gracePeriod=600
Dec 16 08:27:52 crc kubenswrapper[4789]: I1216 08:27:52.657233 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="93694727a36708bd1006bf75a00f6ac1c8b551c001d91ed3b60ce8e5c8ebae39" exitCode=0
Dec 16 08:27:52 crc kubenswrapper[4789]: I1216 08:27:52.657415 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"93694727a36708bd1006bf75a00f6ac1c8b551c001d91ed3b60ce8e5c8ebae39"}
Dec 16 08:27:52 crc kubenswrapper[4789]: I1216 08:27:52.658790 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d"}
Dec 16 08:27:52 crc kubenswrapper[4789]: I1216 08:27:52.658868 4789 scope.go:117] "RemoveContainer" containerID="2a31c088b434905bb2e769df941c576b9f5fcfa925b4cbc9fbbcb214949b7987"
Dec 16 08:27:53 crc kubenswrapper[4789]: I1216 08:27:53.158969 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Dec 16 08:27:53 crc kubenswrapper[4789]: I1216 08:27:53.674062 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f488b83-1fc0-40a5-be04-50e5267d4792","Type":"ContainerStarted","Data":"b3bc00519aa93d9cafb3b44b4100b45c2222ae88c8741fae9728a113a5c58b14"}
Dec 16 08:27:53 crc kubenswrapper[4789]: I1216 08:27:53.675075 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 16 08:27:53 crc kubenswrapper[4789]: I1216 08:27:53.704641 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5576324919999998 podStartE2EDuration="5.704624502s" podCreationTimestamp="2025-12-16 08:27:48 +0000 UTC" firstStartedPulling="2025-12-16 08:27:49.787308072 +0000 UTC m=+5808.049195701" lastFinishedPulling="2025-12-16 08:27:52.934300082 +0000 UTC m=+5811.196187711" observedRunningTime="2025-12-16 08:27:53.703416093 +0000 UTC m=+5811.965303732" watchObservedRunningTime="2025-12-16 08:27:53.704624502 +0000 UTC m=+5811.966512131"
Dec 16 08:27:54 crc kubenswrapper[4789]: I1216 08:27:54.820111 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Dec 16 08:27:54 crc kubenswrapper[4789]: I1216 08:27:54.835019 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Dec 16 08:27:54 crc kubenswrapper[4789]: I1216 08:27:54.933860 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Dec 16 08:28:19 crc kubenswrapper[4789]: I1216 08:28:19.047051 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pskhf"]
Dec 16 08:28:19 crc kubenswrapper[4789]: I1216 08:28:19.057579 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pskhf"]
Dec 16 08:28:19 crc kubenswrapper[4789]: I1216 08:28:19.347818 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 16 08:28:20 crc kubenswrapper[4789]: I1216 08:28:20.115393 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd40f148-e2fa-49e9-8ab9-dec31881d548" path="/var/lib/kubelet/pods/fd40f148-e2fa-49e9-8ab9-dec31881d548/volumes"
Dec 16 08:28:32 crc kubenswrapper[4789]: I1216 08:28:32.162962 4789 scope.go:117] "RemoveContainer" containerID="2417fc65e2b0cf426aaceb5a0880c40bd7ffcbc8e94f73929c6af828f70f2b4e"
Dec 16 08:28:32 crc kubenswrapper[4789]: I1216 08:28:32.194717 4789 scope.go:117] "RemoveContainer" containerID="12087f0bc2306459f2239259278194efb105974ffa8f01d1ee4782a5e25efb29"
Dec 16 08:28:32 crc kubenswrapper[4789]: I1216 08:28:32.244419 4789 scope.go:117] "RemoveContainer" containerID="7f0ac1988ec8a507c83102a5b9ed1eed69eeff1319a0336ab5303bfd8a69c8b4"
Dec 16 08:28:32 crc kubenswrapper[4789]: I1216 08:28:32.292246 4789 scope.go:117] "RemoveContainer" containerID="49c0000cc884030bd91263ba067e1c0efe9e9030220903e1b24a1bc129940433"
Dec 16 08:28:32 crc kubenswrapper[4789]: I1216 08:28:32.337450 4789 scope.go:117] "RemoveContainer" containerID="62e3c8a1bf1ff34a19a392cfbd88506cbd96e8d50f3c63eebfacdb83356de130"
Dec 16 08:28:32 crc kubenswrapper[4789]: I1216 08:28:32.387374 4789 scope.go:117] "RemoveContainer" containerID="73b8eef75e07b02bf777e0d0ecaaf6909a58423a954892f9c2a775ae8bcd1e38"
Dec 16 08:28:32 crc kubenswrapper[4789]: I1216 08:28:32.441247 4789 scope.go:117] "RemoveContainer" containerID="a0e523e6d7bdbac1a4c437a0b340c7e619fe2645c5eb32c8f21e649463b3fe55"
Dec 16 08:28:37 crc kubenswrapper[4789]: I1216 08:28:37.046707 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k2fd8"]
Dec 16 08:28:37 crc kubenswrapper[4789]: I1216 08:28:37.056135 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k2fd8"]
Dec 16 08:28:38 crc kubenswrapper[4789]: I1216 08:28:38.029154 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm7gp"]
Dec 16 08:28:38 crc kubenswrapper[4789]: I1216 08:28:38.038252 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm7gp"]
Dec 16 08:28:38 crc kubenswrapper[4789]: I1216 08:28:38.115060 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452f63cc-39ba-453e-89dc-c8537fd2ff30" path="/var/lib/kubelet/pods/452f63cc-39ba-453e-89dc-c8537fd2ff30/volumes"
Dec 16 08:28:38 crc kubenswrapper[4789]: I1216 08:28:38.115694 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1b68ad-31cf-492b-a3c6-eae046daf5e0" path="/var/lib/kubelet/pods/be1b68ad-31cf-492b-a3c6-eae046daf5e0/volumes"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.504640 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56cd69c9d9-tlh64"]
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.508129 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.510967 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.524334 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56cd69c9d9-tlh64"]
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.613036 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-config\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.613099 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-dns-svc\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.613136 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslwf\" (UniqueName: \"kubernetes.io/projected/43ced17c-1f87-4dd4-a572-eb5a865ffc35-kube-api-access-mslwf\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.613298 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-sb\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.613392 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-openstack-cell1\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.613556 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-nb\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.715776 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mslwf\" (UniqueName: \"kubernetes.io/projected/43ced17c-1f87-4dd4-a572-eb5a865ffc35-kube-api-access-mslwf\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.715837 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-sb\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.715876 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-openstack-cell1\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.715965 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-nb\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.716147 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-config\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.716178 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-dns-svc\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.716840 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-sb\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.717100 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-openstack-cell1\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.717186 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-dns-svc\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.717202 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-config\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.717609 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-nb\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.738066 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslwf\" (UniqueName: \"kubernetes.io/projected/43ced17c-1f87-4dd4-a572-eb5a865ffc35-kube-api-access-mslwf\") pod \"dnsmasq-dns-56cd69c9d9-tlh64\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:41 crc kubenswrapper[4789]: I1216 08:28:41.840793 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:42 crc kubenswrapper[4789]: I1216 08:28:42.294889 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56cd69c9d9-tlh64"]
Dec 16 08:28:42 crc kubenswrapper[4789]: W1216 08:28:42.295095 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43ced17c_1f87_4dd4_a572_eb5a865ffc35.slice/crio-20ce485c09e12534ccdb672d8ba53eacdab90bc0f3e249067696941bc1b868d2 WatchSource:0}: Error finding container 20ce485c09e12534ccdb672d8ba53eacdab90bc0f3e249067696941bc1b868d2: Status 404 returned error can't find the container with id 20ce485c09e12534ccdb672d8ba53eacdab90bc0f3e249067696941bc1b868d2
Dec 16 08:28:43 crc kubenswrapper[4789]: I1216 08:28:43.122327 4789 generic.go:334] "Generic (PLEG): container finished" podID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerID="109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01" exitCode=0
Dec 16 08:28:43 crc kubenswrapper[4789]: I1216 08:28:43.122439 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" event={"ID":"43ced17c-1f87-4dd4-a572-eb5a865ffc35","Type":"ContainerDied","Data":"109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01"}
Dec 16 08:28:43 crc kubenswrapper[4789]: I1216 08:28:43.122614 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" event={"ID":"43ced17c-1f87-4dd4-a572-eb5a865ffc35","Type":"ContainerStarted","Data":"20ce485c09e12534ccdb672d8ba53eacdab90bc0f3e249067696941bc1b868d2"}
Dec 16 08:28:44 crc kubenswrapper[4789]: I1216 08:28:44.132080 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" event={"ID":"43ced17c-1f87-4dd4-a572-eb5a865ffc35","Type":"ContainerStarted","Data":"8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a"}
Dec 16 08:28:44 crc kubenswrapper[4789]: I1216 08:28:44.132658 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:44 crc kubenswrapper[4789]: I1216 08:28:44.156645 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" podStartSLOduration=3.156610991 podStartE2EDuration="3.156610991s" podCreationTimestamp="2025-12-16 08:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:28:44.148744699 +0000 UTC m=+5862.410632348" watchObservedRunningTime="2025-12-16 08:28:44.156610991 +0000 UTC m=+5862.418498610"
Dec 16 08:28:51 crc kubenswrapper[4789]: I1216 08:28:51.843147 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64"
Dec 16 08:28:51 crc kubenswrapper[4789]: I1216 08:28:51.919135 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895d86687-pbz9f"]
Dec 16 08:28:51 crc kubenswrapper[4789]: I1216 08:28:51.919359 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895d86687-pbz9f" podUID="647d85e5-d8f3-466d-b348-52bfd9196717" containerName="dnsmasq-dns" containerID="cri-o://84eb2cf8cb71f818502773af72b5110db1d8259f3833f0b67d88fd3d42c4741f" gracePeriod=10
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.052837 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8494b7758f-dvnll"]
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.054647 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.072483 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8494b7758f-dvnll"]
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.137595 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-openstack-cell1\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.137663 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-config\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.137713 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-dns-svc\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.137741 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-ovsdbserver-nb\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.137793 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-ovsdbserver-sb\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.137820 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtpv8\" (UniqueName: \"kubernetes.io/projected/00a6f21f-4c8b-423c-b645-2e9ff6222c95-kube-api-access-vtpv8\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.229721 4789 generic.go:334] "Generic (PLEG): container finished" podID="647d85e5-d8f3-466d-b348-52bfd9196717" containerID="84eb2cf8cb71f818502773af72b5110db1d8259f3833f0b67d88fd3d42c4741f" exitCode=0
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.229760 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895d86687-pbz9f" event={"ID":"647d85e5-d8f3-466d-b348-52bfd9196717","Type":"ContainerDied","Data":"84eb2cf8cb71f818502773af72b5110db1d8259f3833f0b67d88fd3d42c4741f"}
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.239418 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-dns-svc\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.239475 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-ovsdbserver-nb\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.239535 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-ovsdbserver-sb\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.239561 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtpv8\" (UniqueName: \"kubernetes.io/projected/00a6f21f-4c8b-423c-b645-2e9ff6222c95-kube-api-access-vtpv8\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.239630 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-openstack-cell1\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.239675 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-config\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.240450 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-config\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.240463 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-dns-svc\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.241173 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-ovsdbserver-nb\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.241357 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-ovsdbserver-sb\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.241818 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/00a6f21f-4c8b-423c-b645-2e9ff6222c95-openstack-cell1\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.278310 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtpv8\" (UniqueName: \"kubernetes.io/projected/00a6f21f-4c8b-423c-b645-2e9ff6222c95-kube-api-access-vtpv8\") pod \"dnsmasq-dns-8494b7758f-dvnll\" (UID: \"00a6f21f-4c8b-423c-b645-2e9ff6222c95\") " pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.408369 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8494b7758f-dvnll"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.533264 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895d86687-pbz9f"
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.646960 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-config\") pod \"647d85e5-d8f3-466d-b348-52bfd9196717\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") "
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.647165 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-dns-svc\") pod \"647d85e5-d8f3-466d-b348-52bfd9196717\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") "
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.647218 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-sb\") pod \"647d85e5-d8f3-466d-b348-52bfd9196717\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") "
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.647253 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-nb\") pod \"647d85e5-d8f3-466d-b348-52bfd9196717\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") "
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.647305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdnd8\" (UniqueName: \"kubernetes.io/projected/647d85e5-d8f3-466d-b348-52bfd9196717-kube-api-access-jdnd8\") pod \"647d85e5-d8f3-466d-b348-52bfd9196717\" (UID: \"647d85e5-d8f3-466d-b348-52bfd9196717\") "
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.652427 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647d85e5-d8f3-466d-b348-52bfd9196717-kube-api-access-jdnd8" (OuterVolumeSpecName: "kube-api-access-jdnd8") pod "647d85e5-d8f3-466d-b348-52bfd9196717" (UID: "647d85e5-d8f3-466d-b348-52bfd9196717"). InnerVolumeSpecName "kube-api-access-jdnd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.709465 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-config" (OuterVolumeSpecName: "config") pod "647d85e5-d8f3-466d-b348-52bfd9196717" (UID: "647d85e5-d8f3-466d-b348-52bfd9196717"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.710507 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "647d85e5-d8f3-466d-b348-52bfd9196717" (UID: "647d85e5-d8f3-466d-b348-52bfd9196717"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.711456 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "647d85e5-d8f3-466d-b348-52bfd9196717" (UID: "647d85e5-d8f3-466d-b348-52bfd9196717"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.720530 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "647d85e5-d8f3-466d-b348-52bfd9196717" (UID: "647d85e5-d8f3-466d-b348-52bfd9196717"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.749740 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.749771 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.749783 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.749794 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdnd8\" (UniqueName: \"kubernetes.io/projected/647d85e5-d8f3-466d-b348-52bfd9196717-kube-api-access-jdnd8\") on node \"crc\" DevicePath \"\"" Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.749803 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/647d85e5-d8f3-466d-b348-52bfd9196717-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:28:52 crc kubenswrapper[4789]: I1216 08:28:52.883597 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8494b7758f-dvnll"] Dec 16 
08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.240722 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895d86687-pbz9f" event={"ID":"647d85e5-d8f3-466d-b348-52bfd9196717","Type":"ContainerDied","Data":"482f93eabc17088c616f0385ea1304bde2aeef8b8ceea3251e0c5b43f8ce1c68"} Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.240781 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895d86687-pbz9f" Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.241000 4789 scope.go:117] "RemoveContainer" containerID="84eb2cf8cb71f818502773af72b5110db1d8259f3833f0b67d88fd3d42c4741f" Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.253865 4789 generic.go:334] "Generic (PLEG): container finished" podID="00a6f21f-4c8b-423c-b645-2e9ff6222c95" containerID="68c9b73e4eb03ac928ca255fea49d99c7ed16b9ee0126345bfc291b0847144a9" exitCode=0 Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.253902 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8494b7758f-dvnll" event={"ID":"00a6f21f-4c8b-423c-b645-2e9ff6222c95","Type":"ContainerDied","Data":"68c9b73e4eb03ac928ca255fea49d99c7ed16b9ee0126345bfc291b0847144a9"} Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.253950 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8494b7758f-dvnll" event={"ID":"00a6f21f-4c8b-423c-b645-2e9ff6222c95","Type":"ContainerStarted","Data":"ba790a459db33ce6f2f926bc2f8cc3223884d86a6290d6bd2c4a1b23230cad4f"} Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.276388 4789 scope.go:117] "RemoveContainer" containerID="aad89a793090bb68fb5a524c94c6a462c553a6246c44de285ab55752a02f437c" Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.313738 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895d86687-pbz9f"] Dec 16 08:28:53 crc kubenswrapper[4789]: I1216 08:28:53.324362 4789 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-895d86687-pbz9f"] Dec 16 08:28:54 crc kubenswrapper[4789]: I1216 08:28:54.146535 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647d85e5-d8f3-466d-b348-52bfd9196717" path="/var/lib/kubelet/pods/647d85e5-d8f3-466d-b348-52bfd9196717/volumes" Dec 16 08:28:54 crc kubenswrapper[4789]: I1216 08:28:54.274368 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8494b7758f-dvnll" event={"ID":"00a6f21f-4c8b-423c-b645-2e9ff6222c95","Type":"ContainerStarted","Data":"cef55e37a1aab241f8fb8adf30aa84e22a61df6ca8043700195746cc87909924"} Dec 16 08:28:54 crc kubenswrapper[4789]: I1216 08:28:54.274546 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8494b7758f-dvnll" Dec 16 08:28:54 crc kubenswrapper[4789]: I1216 08:28:54.306096 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8494b7758f-dvnll" podStartSLOduration=2.306069429 podStartE2EDuration="2.306069429s" podCreationTimestamp="2025-12-16 08:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:28:54.305342441 +0000 UTC m=+5872.567230070" watchObservedRunningTime="2025-12-16 08:28:54.306069429 +0000 UTC m=+5872.567957058" Dec 16 08:28:57 crc kubenswrapper[4789]: I1216 08:28:57.049943 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vf547"] Dec 16 08:28:57 crc kubenswrapper[4789]: I1216 08:28:57.059925 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vf547"] Dec 16 08:28:58 crc kubenswrapper[4789]: I1216 08:28:58.116957 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e" path="/var/lib/kubelet/pods/8a39c3d1-ce07-4b49-a2d2-a5a29fe3731e/volumes" Dec 16 08:29:02 crc kubenswrapper[4789]: I1216 
08:29:02.410266 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8494b7758f-dvnll" Dec 16 08:29:02 crc kubenswrapper[4789]: I1216 08:29:02.478585 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cd69c9d9-tlh64"] Dec 16 08:29:02 crc kubenswrapper[4789]: I1216 08:29:02.479635 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" podUID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerName="dnsmasq-dns" containerID="cri-o://8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a" gracePeriod=10 Dec 16 08:29:02 crc kubenswrapper[4789]: I1216 08:29:02.994363 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.084153 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-sb\") pod \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.084218 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-dns-svc\") pod \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.084289 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-config\") pod \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.084388 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-openstack-cell1\") pod \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.084444 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-nb\") pod \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.084497 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mslwf\" (UniqueName: \"kubernetes.io/projected/43ced17c-1f87-4dd4-a572-eb5a865ffc35-kube-api-access-mslwf\") pod \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\" (UID: \"43ced17c-1f87-4dd4-a572-eb5a865ffc35\") " Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.094209 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ced17c-1f87-4dd4-a572-eb5a865ffc35-kube-api-access-mslwf" (OuterVolumeSpecName: "kube-api-access-mslwf") pod "43ced17c-1f87-4dd4-a572-eb5a865ffc35" (UID: "43ced17c-1f87-4dd4-a572-eb5a865ffc35"). InnerVolumeSpecName "kube-api-access-mslwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.141933 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "43ced17c-1f87-4dd4-a572-eb5a865ffc35" (UID: "43ced17c-1f87-4dd4-a572-eb5a865ffc35"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.144770 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-config" (OuterVolumeSpecName: "config") pod "43ced17c-1f87-4dd4-a572-eb5a865ffc35" (UID: "43ced17c-1f87-4dd4-a572-eb5a865ffc35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.149982 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43ced17c-1f87-4dd4-a572-eb5a865ffc35" (UID: "43ced17c-1f87-4dd4-a572-eb5a865ffc35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.150955 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43ced17c-1f87-4dd4-a572-eb5a865ffc35" (UID: "43ced17c-1f87-4dd4-a572-eb5a865ffc35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.152420 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43ced17c-1f87-4dd4-a572-eb5a865ffc35" (UID: "43ced17c-1f87-4dd4-a572-eb5a865ffc35"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.188098 4789 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.189689 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.189781 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mslwf\" (UniqueName: \"kubernetes.io/projected/43ced17c-1f87-4dd4-a572-eb5a865ffc35-kube-api-access-mslwf\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.189832 4789 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.189878 4789 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.189942 4789 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43ced17c-1f87-4dd4-a572-eb5a865ffc35-config\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.361223 4789 generic.go:334] "Generic (PLEG): container finished" podID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerID="8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a" exitCode=0 Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.361268 4789 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" event={"ID":"43ced17c-1f87-4dd4-a572-eb5a865ffc35","Type":"ContainerDied","Data":"8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a"} Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.361339 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" event={"ID":"43ced17c-1f87-4dd4-a572-eb5a865ffc35","Type":"ContainerDied","Data":"20ce485c09e12534ccdb672d8ba53eacdab90bc0f3e249067696941bc1b868d2"} Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.361359 4789 scope.go:117] "RemoveContainer" containerID="8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.361362 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cd69c9d9-tlh64" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.400706 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cd69c9d9-tlh64"] Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.407981 4789 scope.go:117] "RemoveContainer" containerID="109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.409164 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56cd69c9d9-tlh64"] Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.430923 4789 scope.go:117] "RemoveContainer" containerID="8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a" Dec 16 08:29:03 crc kubenswrapper[4789]: E1216 08:29:03.431626 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a\": container with ID starting with 8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a not found: ID does not exist" 
containerID="8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.431671 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a"} err="failed to get container status \"8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a\": rpc error: code = NotFound desc = could not find container \"8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a\": container with ID starting with 8015935d5d4c17158877aa802d2aecd6c127627025c4f0ae271691d9a61fb42a not found: ID does not exist" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.431713 4789 scope.go:117] "RemoveContainer" containerID="109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01" Dec 16 08:29:03 crc kubenswrapper[4789]: E1216 08:29:03.432061 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01\": container with ID starting with 109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01 not found: ID does not exist" containerID="109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01" Dec 16 08:29:03 crc kubenswrapper[4789]: I1216 08:29:03.432095 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01"} err="failed to get container status \"109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01\": rpc error: code = NotFound desc = could not find container \"109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01\": container with ID starting with 109714b93e0a9f5e5e8be7799d629d042190264282446d3617d1dd07e943af01 not found: ID does not exist" Dec 16 08:29:04 crc kubenswrapper[4789]: I1216 08:29:04.116140 4789 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" path="/var/lib/kubelet/pods/43ced17c-1f87-4dd4-a572-eb5a865ffc35/volumes" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.346025 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl"] Dec 16 08:29:13 crc kubenswrapper[4789]: E1216 08:29:13.346898 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerName="init" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.346925 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerName="init" Dec 16 08:29:13 crc kubenswrapper[4789]: E1216 08:29:13.346944 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647d85e5-d8f3-466d-b348-52bfd9196717" containerName="dnsmasq-dns" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.346950 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="647d85e5-d8f3-466d-b348-52bfd9196717" containerName="dnsmasq-dns" Dec 16 08:29:13 crc kubenswrapper[4789]: E1216 08:29:13.346965 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647d85e5-d8f3-466d-b348-52bfd9196717" containerName="init" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.346974 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="647d85e5-d8f3-466d-b348-52bfd9196717" containerName="init" Dec 16 08:29:13 crc kubenswrapper[4789]: E1216 08:29:13.346985 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerName="dnsmasq-dns" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.346992 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerName="dnsmasq-dns" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.347217 4789 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="647d85e5-d8f3-466d-b348-52bfd9196717" containerName="dnsmasq-dns" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.347246 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ced17c-1f87-4dd4-a572-eb5a865ffc35" containerName="dnsmasq-dns" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.350199 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.353516 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.363755 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.364246 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.366003 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.404320 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl"] Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.404896 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.404955 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.405165 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.405293 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxz4\" (UniqueName: \"kubernetes.io/projected/9a8c6e87-54ee-4a74-9754-6eace44ccce0-kube-api-access-tpxz4\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.405452 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.507114 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.507354 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxz4\" (UniqueName: \"kubernetes.io/projected/9a8c6e87-54ee-4a74-9754-6eace44ccce0-kube-api-access-tpxz4\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.507490 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.507609 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.507677 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-pre-adoption-validation-combined-ca-bundle\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.514534 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.514593 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.516369 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.526830 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.529312 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxz4\" (UniqueName: \"kubernetes.io/projected/9a8c6e87-54ee-4a74-9754-6eace44ccce0-kube-api-access-tpxz4\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:13 crc kubenswrapper[4789]: I1216 08:29:13.680768 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:14 crc kubenswrapper[4789]: I1216 08:29:14.218714 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl"] Dec 16 08:29:14 crc kubenswrapper[4789]: I1216 08:29:14.456055 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" event={"ID":"9a8c6e87-54ee-4a74-9754-6eace44ccce0","Type":"ContainerStarted","Data":"40e9841deb0975350974f5128b4383dee380f95fc1d8f35623f79a9c516870ec"} Dec 16 08:29:29 crc kubenswrapper[4789]: I1216 08:29:29.590463 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" event={"ID":"9a8c6e87-54ee-4a74-9754-6eace44ccce0","Type":"ContainerStarted","Data":"e7633914af225871665d759bd17705dde9dab29dc723cb45db7e110bbf8cf295"} Dec 16 08:29:29 crc kubenswrapper[4789]: I1216 08:29:29.615681 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" podStartSLOduration=1.759320027 podStartE2EDuration="16.615661507s" podCreationTimestamp="2025-12-16 08:29:13 +0000 UTC" 
firstStartedPulling="2025-12-16 08:29:14.236367943 +0000 UTC m=+5892.498255572" lastFinishedPulling="2025-12-16 08:29:29.092709423 +0000 UTC m=+5907.354597052" observedRunningTime="2025-12-16 08:29:29.605383586 +0000 UTC m=+5907.867271225" watchObservedRunningTime="2025-12-16 08:29:29.615661507 +0000 UTC m=+5907.877549136" Dec 16 08:29:32 crc kubenswrapper[4789]: I1216 08:29:32.602268 4789 scope.go:117] "RemoveContainer" containerID="cba21a7f86780aeb9cebb526e51452f845c0224e19e738f525da12f2ed20815b" Dec 16 08:29:32 crc kubenswrapper[4789]: I1216 08:29:32.652438 4789 scope.go:117] "RemoveContainer" containerID="37e832098602f0dbb8a0559ccd18310279ae4c905118ed7222be7b524dc075a1" Dec 16 08:29:32 crc kubenswrapper[4789]: I1216 08:29:32.709498 4789 scope.go:117] "RemoveContainer" containerID="c9867257d345bac4aba9e637fc89197fa8bbfdbd1ebeb7adaa9454fa842cd0f4" Dec 16 08:29:38 crc kubenswrapper[4789]: I1216 08:29:38.044302 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mzk6c"] Dec 16 08:29:38 crc kubenswrapper[4789]: I1216 08:29:38.054824 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mzk6c"] Dec 16 08:29:38 crc kubenswrapper[4789]: I1216 08:29:38.117383 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7" path="/var/lib/kubelet/pods/9cbe7fd2-2ab0-4f01-9e3d-b869964ee9a7/volumes" Dec 16 08:29:39 crc kubenswrapper[4789]: I1216 08:29:39.024492 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fd42-account-create-update-4ttgv"] Dec 16 08:29:39 crc kubenswrapper[4789]: I1216 08:29:39.034161 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fd42-account-create-update-4ttgv"] Dec 16 08:29:40 crc kubenswrapper[4789]: I1216 08:29:40.118072 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5673468d-5701-40b7-870a-fb32e6c9d60a" 
path="/var/lib/kubelet/pods/5673468d-5701-40b7-870a-fb32e6c9d60a/volumes" Dec 16 08:29:41 crc kubenswrapper[4789]: I1216 08:29:41.698874 4789 generic.go:334] "Generic (PLEG): container finished" podID="9a8c6e87-54ee-4a74-9754-6eace44ccce0" containerID="e7633914af225871665d759bd17705dde9dab29dc723cb45db7e110bbf8cf295" exitCode=0 Dec 16 08:29:41 crc kubenswrapper[4789]: I1216 08:29:41.698932 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" event={"ID":"9a8c6e87-54ee-4a74-9754-6eace44ccce0","Type":"ContainerDied","Data":"e7633914af225871665d759bd17705dde9dab29dc723cb45db7e110bbf8cf295"} Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.125832 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.167270 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ceph\") pod \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.167333 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-pre-adoption-validation-combined-ca-bundle\") pod \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.167357 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-inventory\") pod \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " Dec 16 08:29:43 
crc kubenswrapper[4789]: I1216 08:29:43.167402 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ssh-key\") pod \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.167505 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxz4\" (UniqueName: \"kubernetes.io/projected/9a8c6e87-54ee-4a74-9754-6eace44ccce0-kube-api-access-tpxz4\") pod \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\" (UID: \"9a8c6e87-54ee-4a74-9754-6eace44ccce0\") " Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.178855 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ceph" (OuterVolumeSpecName: "ceph") pod "9a8c6e87-54ee-4a74-9754-6eace44ccce0" (UID: "9a8c6e87-54ee-4a74-9754-6eace44ccce0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.179148 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "9a8c6e87-54ee-4a74-9754-6eace44ccce0" (UID: "9a8c6e87-54ee-4a74-9754-6eace44ccce0"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.179418 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a8c6e87-54ee-4a74-9754-6eace44ccce0-kube-api-access-tpxz4" (OuterVolumeSpecName: "kube-api-access-tpxz4") pod "9a8c6e87-54ee-4a74-9754-6eace44ccce0" (UID: "9a8c6e87-54ee-4a74-9754-6eace44ccce0"). 
InnerVolumeSpecName "kube-api-access-tpxz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.196353 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-inventory" (OuterVolumeSpecName: "inventory") pod "9a8c6e87-54ee-4a74-9754-6eace44ccce0" (UID: "9a8c6e87-54ee-4a74-9754-6eace44ccce0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.197857 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9a8c6e87-54ee-4a74-9754-6eace44ccce0" (UID: "9a8c6e87-54ee-4a74-9754-6eace44ccce0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.270499 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.270533 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxz4\" (UniqueName: \"kubernetes.io/projected/9a8c6e87-54ee-4a74-9754-6eace44ccce0-kube-api-access-tpxz4\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.270550 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.270561 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:43 crc 
kubenswrapper[4789]: I1216 08:29:43.270574 4789 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a8c6e87-54ee-4a74-9754-6eace44ccce0-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.717136 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" event={"ID":"9a8c6e87-54ee-4a74-9754-6eace44ccce0","Type":"ContainerDied","Data":"40e9841deb0975350974f5128b4383dee380f95fc1d8f35623f79a9c516870ec"} Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.717637 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40e9841deb0975350974f5128b4383dee380f95fc1d8f35623f79a9c516870ec" Dec 16 08:29:43 crc kubenswrapper[4789]: I1216 08:29:43.717173 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.987501 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q"] Dec 16 08:29:45 crc kubenswrapper[4789]: E1216 08:29:45.988301 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8c6e87-54ee-4a74-9754-6eace44ccce0" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.988322 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8c6e87-54ee-4a74-9754-6eace44ccce0" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.988545 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8c6e87-54ee-4a74-9754-6eace44ccce0" 
containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.989420 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.991611 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.991743 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.992351 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:29:45 crc kubenswrapper[4789]: I1216 08:29:45.992526 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.022071 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q"] Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.027214 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.027461 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqt65\" (UniqueName: \"kubernetes.io/projected/18288168-e59a-407b-99e1-0a8f2a73109d-kube-api-access-nqt65\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.027619 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.027727 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.027842 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.130465 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc 
kubenswrapper[4789]: I1216 08:29:46.130803 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqt65\" (UniqueName: \"kubernetes.io/projected/18288168-e59a-407b-99e1-0a8f2a73109d-kube-api-access-nqt65\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.130939 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.130970 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.131031 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.136701 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: 
\"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.137438 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.138218 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.141705 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.147880 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqt65\" (UniqueName: \"kubernetes.io/projected/18288168-e59a-407b-99e1-0a8f2a73109d-kube-api-access-nqt65\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.328818 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:29:46 crc kubenswrapper[4789]: I1216 08:29:46.843952 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q"] Dec 16 08:29:47 crc kubenswrapper[4789]: I1216 08:29:47.751141 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" event={"ID":"18288168-e59a-407b-99e1-0a8f2a73109d","Type":"ContainerStarted","Data":"db198ecc9ea9582716c197fe7943ecb3e495abccf004efba7e1e863c71a3e221"} Dec 16 08:29:47 crc kubenswrapper[4789]: I1216 08:29:47.751474 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" event={"ID":"18288168-e59a-407b-99e1-0a8f2a73109d","Type":"ContainerStarted","Data":"1ccc1e0bdde25aa134220e409f63820329dfdc49dd32a3f82a158e974ddce224"} Dec 16 08:29:47 crc kubenswrapper[4789]: I1216 08:29:47.772074 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" podStartSLOduration=2.267456771 podStartE2EDuration="2.772054216s" podCreationTimestamp="2025-12-16 08:29:45 +0000 UTC" firstStartedPulling="2025-12-16 08:29:46.862235806 +0000 UTC m=+5925.124123435" lastFinishedPulling="2025-12-16 08:29:47.366833251 +0000 UTC m=+5925.628720880" observedRunningTime="2025-12-16 08:29:47.764668296 +0000 UTC m=+5926.026555935" watchObservedRunningTime="2025-12-16 08:29:47.772054216 +0000 UTC m=+5926.033941835" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.145808 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx"] Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.149321 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.152261 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.154021 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.158048 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx"] Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.225540 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5ed2f6-2519-4162-b31d-16fb006bc53d-secret-volume\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.225666 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5ed2f6-2519-4162-b31d-16fb006bc53d-config-volume\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.225812 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw2zf\" (UniqueName: \"kubernetes.io/projected/fa5ed2f6-2519-4162-b31d-16fb006bc53d-kube-api-access-hw2zf\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.331769 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5ed2f6-2519-4162-b31d-16fb006bc53d-secret-volume\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.332031 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5ed2f6-2519-4162-b31d-16fb006bc53d-config-volume\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.332394 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw2zf\" (UniqueName: \"kubernetes.io/projected/fa5ed2f6-2519-4162-b31d-16fb006bc53d-kube-api-access-hw2zf\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.333023 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5ed2f6-2519-4162-b31d-16fb006bc53d-config-volume\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.337784 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fa5ed2f6-2519-4162-b31d-16fb006bc53d-secret-volume\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.352877 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw2zf\" (UniqueName: \"kubernetes.io/projected/fa5ed2f6-2519-4162-b31d-16fb006bc53d-kube-api-access-hw2zf\") pod \"collect-profiles-29431230-2kfnx\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.474521 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:00 crc kubenswrapper[4789]: I1216 08:30:00.949498 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx"] Dec 16 08:30:00 crc kubenswrapper[4789]: W1216 08:30:00.953895 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa5ed2f6_2519_4162_b31d_16fb006bc53d.slice/crio-b1ed40041541b3013109bccebd809915862c4eb58d5533de5d8ecd8f56ab1d6a WatchSource:0}: Error finding container b1ed40041541b3013109bccebd809915862c4eb58d5533de5d8ecd8f56ab1d6a: Status 404 returned error can't find the container with id b1ed40041541b3013109bccebd809915862c4eb58d5533de5d8ecd8f56ab1d6a Dec 16 08:30:01 crc kubenswrapper[4789]: I1216 08:30:01.881638 4789 generic.go:334] "Generic (PLEG): container finished" podID="fa5ed2f6-2519-4162-b31d-16fb006bc53d" containerID="2722fb02af00f277d6613ad5320fcf6fcc623e5f628b7203193a965f3f709d7c" exitCode=0 Dec 16 08:30:01 crc kubenswrapper[4789]: I1216 08:30:01.882173 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" event={"ID":"fa5ed2f6-2519-4162-b31d-16fb006bc53d","Type":"ContainerDied","Data":"2722fb02af00f277d6613ad5320fcf6fcc623e5f628b7203193a965f3f709d7c"} Dec 16 08:30:01 crc kubenswrapper[4789]: I1216 08:30:01.882218 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" event={"ID":"fa5ed2f6-2519-4162-b31d-16fb006bc53d","Type":"ContainerStarted","Data":"b1ed40041541b3013109bccebd809915862c4eb58d5533de5d8ecd8f56ab1d6a"} Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.242168 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.300136 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw2zf\" (UniqueName: \"kubernetes.io/projected/fa5ed2f6-2519-4162-b31d-16fb006bc53d-kube-api-access-hw2zf\") pod \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.300337 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5ed2f6-2519-4162-b31d-16fb006bc53d-config-volume\") pod \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.300393 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5ed2f6-2519-4162-b31d-16fb006bc53d-secret-volume\") pod \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\" (UID: \"fa5ed2f6-2519-4162-b31d-16fb006bc53d\") " Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.300822 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fa5ed2f6-2519-4162-b31d-16fb006bc53d-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa5ed2f6-2519-4162-b31d-16fb006bc53d" (UID: "fa5ed2f6-2519-4162-b31d-16fb006bc53d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.301364 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5ed2f6-2519-4162-b31d-16fb006bc53d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.305557 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5ed2f6-2519-4162-b31d-16fb006bc53d-kube-api-access-hw2zf" (OuterVolumeSpecName: "kube-api-access-hw2zf") pod "fa5ed2f6-2519-4162-b31d-16fb006bc53d" (UID: "fa5ed2f6-2519-4162-b31d-16fb006bc53d"). InnerVolumeSpecName "kube-api-access-hw2zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.305723 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5ed2f6-2519-4162-b31d-16fb006bc53d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa5ed2f6-2519-4162-b31d-16fb006bc53d" (UID: "fa5ed2f6-2519-4162-b31d-16fb006bc53d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.403228 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5ed2f6-2519-4162-b31d-16fb006bc53d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.403275 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw2zf\" (UniqueName: \"kubernetes.io/projected/fa5ed2f6-2519-4162-b31d-16fb006bc53d-kube-api-access-hw2zf\") on node \"crc\" DevicePath \"\"" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.916783 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" event={"ID":"fa5ed2f6-2519-4162-b31d-16fb006bc53d","Type":"ContainerDied","Data":"b1ed40041541b3013109bccebd809915862c4eb58d5533de5d8ecd8f56ab1d6a"} Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.916831 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1ed40041541b3013109bccebd809915862c4eb58d5533de5d8ecd8f56ab1d6a" Dec 16 08:30:03 crc kubenswrapper[4789]: I1216 08:30:03.916854 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx" Dec 16 08:30:04 crc kubenswrapper[4789]: I1216 08:30:04.322313 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn"] Dec 16 08:30:04 crc kubenswrapper[4789]: I1216 08:30:04.334012 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431185-4t5tn"] Dec 16 08:30:06 crc kubenswrapper[4789]: I1216 08:30:06.118463 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b256111-1ac9-4f85-930e-4316e29c55fe" path="/var/lib/kubelet/pods/4b256111-1ac9-4f85-930e-4316e29c55fe/volumes" Dec 16 08:30:20 crc kubenswrapper[4789]: I1216 08:30:20.036868 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-m7qnf"] Dec 16 08:30:20 crc kubenswrapper[4789]: I1216 08:30:20.046162 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-m7qnf"] Dec 16 08:30:20 crc kubenswrapper[4789]: I1216 08:30:20.119523 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011d39c2-528c-42f0-8a97-5e3e06caa1c0" path="/var/lib/kubelet/pods/011d39c2-528c-42f0-8a97-5e3e06caa1c0/volumes" Dec 16 08:30:21 crc kubenswrapper[4789]: I1216 08:30:21.928316 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:30:21 crc kubenswrapper[4789]: I1216 08:30:21.928708 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:30:32 crc kubenswrapper[4789]: I1216 08:30:32.864004 4789 scope.go:117] "RemoveContainer" containerID="40bc6d14f1f90680efa2f89e6ad96d68bc5f0b2d972df11712cfe2e5ef89774f" Dec 16 08:30:32 crc kubenswrapper[4789]: I1216 08:30:32.929529 4789 scope.go:117] "RemoveContainer" containerID="12886505534ad285dea0ad937195d6b4c00f2e06f16c2723907d994a1152137f" Dec 16 08:30:32 crc kubenswrapper[4789]: I1216 08:30:32.967231 4789 scope.go:117] "RemoveContainer" containerID="255a38ae88936dc2f3cffa4a9fcd2c9d567efcb4527cd4fe51a755020b694393" Dec 16 08:30:33 crc kubenswrapper[4789]: I1216 08:30:33.045209 4789 scope.go:117] "RemoveContainer" containerID="895267cca919ebefdbde50b4d237657a7090e3451c3d21a98e1a922f104a929f" Dec 16 08:30:51 crc kubenswrapper[4789]: I1216 08:30:51.928164 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:30:51 crc kubenswrapper[4789]: I1216 08:30:51.928702 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:31:21 crc kubenswrapper[4789]: I1216 08:31:21.927847 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:31:21 crc kubenswrapper[4789]: I1216 08:31:21.928487 4789 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:31:21 crc kubenswrapper[4789]: I1216 08:31:21.928533 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:31:21 crc kubenswrapper[4789]: I1216 08:31:21.929050 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:31:21 crc kubenswrapper[4789]: I1216 08:31:21.929126 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" gracePeriod=600 Dec 16 08:31:22 crc kubenswrapper[4789]: E1216 08:31:22.063841 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:31:22 crc kubenswrapper[4789]: I1216 08:31:22.602747 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" 
containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" exitCode=0 Dec 16 08:31:22 crc kubenswrapper[4789]: I1216 08:31:22.602792 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d"} Dec 16 08:31:22 crc kubenswrapper[4789]: I1216 08:31:22.602825 4789 scope.go:117] "RemoveContainer" containerID="93694727a36708bd1006bf75a00f6ac1c8b551c001d91ed3b60ce8e5c8ebae39" Dec 16 08:31:22 crc kubenswrapper[4789]: I1216 08:31:22.603541 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:31:22 crc kubenswrapper[4789]: E1216 08:31:22.603842 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:31:37 crc kubenswrapper[4789]: I1216 08:31:37.104792 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:31:37 crc kubenswrapper[4789]: E1216 08:31:37.105679 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:31:51 crc kubenswrapper[4789]: I1216 
08:31:51.106271 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:31:51 crc kubenswrapper[4789]: E1216 08:31:51.107461 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.344122 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.344667 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.344129 4789 patch_prober.go:28] interesting pod/console-operator-58897d9998-s57cr container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.345005 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-s57cr" podUID="c86e6908-9ec3-4e62-b9cf-86f136b1dc6a" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.344216 4789 patch_prober.go:28] interesting pod/console-operator-58897d9998-s57cr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.345045 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-s57cr" podUID="c86e6908-9ec3-4e62-b9cf-86f136b1dc6a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.344245 4789 patch_prober.go:28] interesting pod/router-default-5444994796-pwj9t container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 08:31:57 crc kubenswrapper[4789]: I1216 08:31:57.345072 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-pwj9t" podUID="5221dd3a-57e8-43ff-ac08-62cbfc025419" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 08:32:04 crc kubenswrapper[4789]: I1216 08:32:04.105316 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 
08:32:04 crc kubenswrapper[4789]: E1216 08:32:04.106012 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.432378 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7mkqg"] Dec 16 08:32:05 crc kubenswrapper[4789]: E1216 08:32:05.433310 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5ed2f6-2519-4162-b31d-16fb006bc53d" containerName="collect-profiles" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.433327 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5ed2f6-2519-4162-b31d-16fb006bc53d" containerName="collect-profiles" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.433571 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5ed2f6-2519-4162-b31d-16fb006bc53d" containerName="collect-profiles" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.435621 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.445183 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mkqg"] Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.533731 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/cea26c06-d789-4ad6-8623-f6ab72328e25-kube-api-access-wzlrf\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.535411 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-catalog-content\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.535750 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-utilities\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.637232 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/cea26c06-d789-4ad6-8623-f6ab72328e25-kube-api-access-wzlrf\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.637746 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-catalog-content\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.638389 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-catalog-content\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.638497 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-utilities\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.638830 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-utilities\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.658442 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/cea26c06-d789-4ad6-8623-f6ab72328e25-kube-api-access-wzlrf\") pod \"community-operators-7mkqg\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:05 crc kubenswrapper[4789]: I1216 08:32:05.762303 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:06 crc kubenswrapper[4789]: I1216 08:32:06.303514 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mkqg"] Dec 16 08:32:07 crc kubenswrapper[4789]: I1216 08:32:07.002399 4789 generic.go:334] "Generic (PLEG): container finished" podID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerID="ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590" exitCode=0 Dec 16 08:32:07 crc kubenswrapper[4789]: I1216 08:32:07.002605 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mkqg" event={"ID":"cea26c06-d789-4ad6-8623-f6ab72328e25","Type":"ContainerDied","Data":"ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590"} Dec 16 08:32:07 crc kubenswrapper[4789]: I1216 08:32:07.002666 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mkqg" event={"ID":"cea26c06-d789-4ad6-8623-f6ab72328e25","Type":"ContainerStarted","Data":"1bdbd3704a042670408ca7afdf2827480884cf334540b438c5a9f1f346717344"} Dec 16 08:32:07 crc kubenswrapper[4789]: I1216 08:32:07.004437 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:32:08 crc kubenswrapper[4789]: I1216 08:32:08.013885 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mkqg" event={"ID":"cea26c06-d789-4ad6-8623-f6ab72328e25","Type":"ContainerStarted","Data":"1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694"} Dec 16 08:32:09 crc kubenswrapper[4789]: I1216 08:32:09.023572 4789 generic.go:334] "Generic (PLEG): container finished" podID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerID="1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694" exitCode=0 Dec 16 08:32:09 crc kubenswrapper[4789]: I1216 08:32:09.023676 4789 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-7mkqg" event={"ID":"cea26c06-d789-4ad6-8623-f6ab72328e25","Type":"ContainerDied","Data":"1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694"} Dec 16 08:32:11 crc kubenswrapper[4789]: I1216 08:32:11.043223 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mkqg" event={"ID":"cea26c06-d789-4ad6-8623-f6ab72328e25","Type":"ContainerStarted","Data":"29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0"} Dec 16 08:32:11 crc kubenswrapper[4789]: I1216 08:32:11.069604 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7mkqg" podStartSLOduration=2.989601942 podStartE2EDuration="6.069581083s" podCreationTimestamp="2025-12-16 08:32:05 +0000 UTC" firstStartedPulling="2025-12-16 08:32:07.004164902 +0000 UTC m=+6065.266052531" lastFinishedPulling="2025-12-16 08:32:10.084144043 +0000 UTC m=+6068.346031672" observedRunningTime="2025-12-16 08:32:11.059363243 +0000 UTC m=+6069.321250892" watchObservedRunningTime="2025-12-16 08:32:11.069581083 +0000 UTC m=+6069.331468712" Dec 16 08:32:15 crc kubenswrapper[4789]: I1216 08:32:15.762976 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:15 crc kubenswrapper[4789]: I1216 08:32:15.763879 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:15 crc kubenswrapper[4789]: I1216 08:32:15.824659 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:16 crc kubenswrapper[4789]: I1216 08:32:16.174737 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:16 crc kubenswrapper[4789]: I1216 
08:32:16.221293 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7mkqg"] Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.137201 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7mkqg" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="registry-server" containerID="cri-o://29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0" gracePeriod=2 Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.608748 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.728807 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-catalog-content\") pod \"cea26c06-d789-4ad6-8623-f6ab72328e25\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.728886 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-utilities\") pod \"cea26c06-d789-4ad6-8623-f6ab72328e25\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.728973 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/cea26c06-d789-4ad6-8623-f6ab72328e25-kube-api-access-wzlrf\") pod \"cea26c06-d789-4ad6-8623-f6ab72328e25\" (UID: \"cea26c06-d789-4ad6-8623-f6ab72328e25\") " Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.730078 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-utilities" (OuterVolumeSpecName: 
"utilities") pod "cea26c06-d789-4ad6-8623-f6ab72328e25" (UID: "cea26c06-d789-4ad6-8623-f6ab72328e25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.738112 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea26c06-d789-4ad6-8623-f6ab72328e25-kube-api-access-wzlrf" (OuterVolumeSpecName: "kube-api-access-wzlrf") pod "cea26c06-d789-4ad6-8623-f6ab72328e25" (UID: "cea26c06-d789-4ad6-8623-f6ab72328e25"). InnerVolumeSpecName "kube-api-access-wzlrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.781185 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cea26c06-d789-4ad6-8623-f6ab72328e25" (UID: "cea26c06-d789-4ad6-8623-f6ab72328e25"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.830888 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.830945 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea26c06-d789-4ad6-8623-f6ab72328e25-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:32:18 crc kubenswrapper[4789]: I1216 08:32:18.830956 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/cea26c06-d789-4ad6-8623-f6ab72328e25-kube-api-access-wzlrf\") on node \"crc\" DevicePath \"\"" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.104938 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:32:19 crc kubenswrapper[4789]: E1216 08:32:19.105481 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.173964 4789 generic.go:334] "Generic (PLEG): container finished" podID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerID="29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0" exitCode=0 Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.174018 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mkqg" 
event={"ID":"cea26c06-d789-4ad6-8623-f6ab72328e25","Type":"ContainerDied","Data":"29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0"} Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.174069 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mkqg" event={"ID":"cea26c06-d789-4ad6-8623-f6ab72328e25","Type":"ContainerDied","Data":"1bdbd3704a042670408ca7afdf2827480884cf334540b438c5a9f1f346717344"} Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.174094 4789 scope.go:117] "RemoveContainer" containerID="29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.174207 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mkqg" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.208871 4789 scope.go:117] "RemoveContainer" containerID="1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.232839 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7mkqg"] Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.242600 4789 scope.go:117] "RemoveContainer" containerID="ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.243254 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7mkqg"] Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.292307 4789 scope.go:117] "RemoveContainer" containerID="29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0" Dec 16 08:32:19 crc kubenswrapper[4789]: E1216 08:32:19.293247 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0\": container 
with ID starting with 29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0 not found: ID does not exist" containerID="29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.293320 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0"} err="failed to get container status \"29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0\": rpc error: code = NotFound desc = could not find container \"29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0\": container with ID starting with 29dce307ff22e70cb9c91ceaa5324e90221077444f0867e4746cbaa30188cdc0 not found: ID does not exist" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.293360 4789 scope.go:117] "RemoveContainer" containerID="1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694" Dec 16 08:32:19 crc kubenswrapper[4789]: E1216 08:32:19.296346 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694\": container with ID starting with 1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694 not found: ID does not exist" containerID="1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.296417 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694"} err="failed to get container status \"1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694\": rpc error: code = NotFound desc = could not find container \"1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694\": container with ID starting with 1d5ead26a4681a65563b7b60b47dee0b77b30acad759a24a5a9086f830d41694 not 
found: ID does not exist" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.296463 4789 scope.go:117] "RemoveContainer" containerID="ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590" Dec 16 08:32:19 crc kubenswrapper[4789]: E1216 08:32:19.297367 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590\": container with ID starting with ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590 not found: ID does not exist" containerID="ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590" Dec 16 08:32:19 crc kubenswrapper[4789]: I1216 08:32:19.297431 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590"} err="failed to get container status \"ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590\": rpc error: code = NotFound desc = could not find container \"ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590\": container with ID starting with ba0a426909caf13c3c5cf7da83c1bcd9be19621ed91334db698b402b8206f590 not found: ID does not exist" Dec 16 08:32:20 crc kubenswrapper[4789]: I1216 08:32:20.118129 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" path="/var/lib/kubelet/pods/cea26c06-d789-4ad6-8623-f6ab72328e25/volumes" Dec 16 08:32:22 crc kubenswrapper[4789]: I1216 08:32:22.979722 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkf2p"] Dec 16 08:32:22 crc kubenswrapper[4789]: E1216 08:32:22.980191 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="extract-utilities" Dec 16 08:32:22 crc kubenswrapper[4789]: I1216 08:32:22.980203 4789 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="extract-utilities" Dec 16 08:32:22 crc kubenswrapper[4789]: E1216 08:32:22.980228 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="extract-content" Dec 16 08:32:22 crc kubenswrapper[4789]: I1216 08:32:22.980233 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="extract-content" Dec 16 08:32:22 crc kubenswrapper[4789]: E1216 08:32:22.980258 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="registry-server" Dec 16 08:32:22 crc kubenswrapper[4789]: I1216 08:32:22.980265 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="registry-server" Dec 16 08:32:22 crc kubenswrapper[4789]: I1216 08:32:22.980476 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea26c06-d789-4ad6-8623-f6ab72328e25" containerName="registry-server" Dec 16 08:32:22 crc kubenswrapper[4789]: I1216 08:32:22.982116 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:22 crc kubenswrapper[4789]: I1216 08:32:22.995546 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkf2p"] Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.119389 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77jns\" (UniqueName: \"kubernetes.io/projected/7a9d97d2-478f-4339-9a86-8b998f0b47fe-kube-api-access-77jns\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.119493 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-utilities\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.119637 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-catalog-content\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.221359 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-utilities\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.221425 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-catalog-content\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.221665 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77jns\" (UniqueName: \"kubernetes.io/projected/7a9d97d2-478f-4339-9a86-8b998f0b47fe-kube-api-access-77jns\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.222167 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-catalog-content\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.222661 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-utilities\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.247243 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77jns\" (UniqueName: \"kubernetes.io/projected/7a9d97d2-478f-4339-9a86-8b998f0b47fe-kube-api-access-77jns\") pod \"redhat-operators-xkf2p\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.302660 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:23 crc kubenswrapper[4789]: I1216 08:32:23.849458 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkf2p"] Dec 16 08:32:24 crc kubenswrapper[4789]: I1216 08:32:24.221690 4789 generic.go:334] "Generic (PLEG): container finished" podID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerID="a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5" exitCode=0 Dec 16 08:32:24 crc kubenswrapper[4789]: I1216 08:32:24.221765 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkf2p" event={"ID":"7a9d97d2-478f-4339-9a86-8b998f0b47fe","Type":"ContainerDied","Data":"a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5"} Dec 16 08:32:24 crc kubenswrapper[4789]: I1216 08:32:24.222145 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkf2p" event={"ID":"7a9d97d2-478f-4339-9a86-8b998f0b47fe","Type":"ContainerStarted","Data":"ad95f33f72e0db8e34bd173ea18dd1ea43f65bb9f732836e422ba64d04c6f84c"} Dec 16 08:32:25 crc kubenswrapper[4789]: I1216 08:32:25.234566 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkf2p" event={"ID":"7a9d97d2-478f-4339-9a86-8b998f0b47fe","Type":"ContainerStarted","Data":"fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8"} Dec 16 08:32:31 crc kubenswrapper[4789]: I1216 08:32:31.293271 4789 generic.go:334] "Generic (PLEG): container finished" podID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerID="fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8" exitCode=0 Dec 16 08:32:31 crc kubenswrapper[4789]: I1216 08:32:31.293382 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkf2p" 
event={"ID":"7a9d97d2-478f-4339-9a86-8b998f0b47fe","Type":"ContainerDied","Data":"fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8"} Dec 16 08:32:33 crc kubenswrapper[4789]: I1216 08:32:33.313055 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkf2p" event={"ID":"7a9d97d2-478f-4339-9a86-8b998f0b47fe","Type":"ContainerStarted","Data":"7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c"} Dec 16 08:32:33 crc kubenswrapper[4789]: I1216 08:32:33.338316 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkf2p" podStartSLOduration=3.017350381 podStartE2EDuration="11.33829661s" podCreationTimestamp="2025-12-16 08:32:22 +0000 UTC" firstStartedPulling="2025-12-16 08:32:24.223498335 +0000 UTC m=+6082.485385964" lastFinishedPulling="2025-12-16 08:32:32.544444564 +0000 UTC m=+6090.806332193" observedRunningTime="2025-12-16 08:32:33.329653079 +0000 UTC m=+6091.591540708" watchObservedRunningTime="2025-12-16 08:32:33.33829661 +0000 UTC m=+6091.600184239" Dec 16 08:32:34 crc kubenswrapper[4789]: I1216 08:32:34.105310 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:32:34 crc kubenswrapper[4789]: E1216 08:32:34.106010 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:32:43 crc kubenswrapper[4789]: I1216 08:32:43.303439 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:43 crc 
kubenswrapper[4789]: I1216 08:32:43.306796 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:43 crc kubenswrapper[4789]: I1216 08:32:43.350808 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:43 crc kubenswrapper[4789]: I1216 08:32:43.454330 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:43 crc kubenswrapper[4789]: I1216 08:32:43.583033 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkf2p"] Dec 16 08:32:45 crc kubenswrapper[4789]: I1216 08:32:45.426601 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkf2p" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="registry-server" containerID="cri-o://7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c" gracePeriod=2 Dec 16 08:32:45 crc kubenswrapper[4789]: I1216 08:32:45.931820 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.026347 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77jns\" (UniqueName: \"kubernetes.io/projected/7a9d97d2-478f-4339-9a86-8b998f0b47fe-kube-api-access-77jns\") pod \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.026563 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-catalog-content\") pod \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.026703 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-utilities\") pod \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\" (UID: \"7a9d97d2-478f-4339-9a86-8b998f0b47fe\") " Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.027584 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-utilities" (OuterVolumeSpecName: "utilities") pod "7a9d97d2-478f-4339-9a86-8b998f0b47fe" (UID: "7a9d97d2-478f-4339-9a86-8b998f0b47fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.032920 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9d97d2-478f-4339-9a86-8b998f0b47fe-kube-api-access-77jns" (OuterVolumeSpecName: "kube-api-access-77jns") pod "7a9d97d2-478f-4339-9a86-8b998f0b47fe" (UID: "7a9d97d2-478f-4339-9a86-8b998f0b47fe"). InnerVolumeSpecName "kube-api-access-77jns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.108772 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:32:46 crc kubenswrapper[4789]: E1216 08:32:46.109098 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.129223 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.129266 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77jns\" (UniqueName: \"kubernetes.io/projected/7a9d97d2-478f-4339-9a86-8b998f0b47fe-kube-api-access-77jns\") on node \"crc\" DevicePath \"\"" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.152504 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a9d97d2-478f-4339-9a86-8b998f0b47fe" (UID: "7a9d97d2-478f-4339-9a86-8b998f0b47fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.233070 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9d97d2-478f-4339-9a86-8b998f0b47fe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.437711 4789 generic.go:334] "Generic (PLEG): container finished" podID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerID="7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c" exitCode=0 Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.437770 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkf2p" event={"ID":"7a9d97d2-478f-4339-9a86-8b998f0b47fe","Type":"ContainerDied","Data":"7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c"} Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.437809 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkf2p" event={"ID":"7a9d97d2-478f-4339-9a86-8b998f0b47fe","Type":"ContainerDied","Data":"ad95f33f72e0db8e34bd173ea18dd1ea43f65bb9f732836e422ba64d04c6f84c"} Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.437830 4789 scope.go:117] "RemoveContainer" containerID="7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.437863 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkf2p" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.462757 4789 scope.go:117] "RemoveContainer" containerID="fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.484121 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkf2p"] Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.491761 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkf2p"] Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.517509 4789 scope.go:117] "RemoveContainer" containerID="a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.541287 4789 scope.go:117] "RemoveContainer" containerID="7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c" Dec 16 08:32:46 crc kubenswrapper[4789]: E1216 08:32:46.541835 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c\": container with ID starting with 7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c not found: ID does not exist" containerID="7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.541886 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c"} err="failed to get container status \"7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c\": rpc error: code = NotFound desc = could not find container \"7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c\": container with ID starting with 7cf74d5d520e3416fc7a736b54f83090998ab97429a9382153911ddc17a1806c not found: ID does 
not exist" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.541932 4789 scope.go:117] "RemoveContainer" containerID="fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8" Dec 16 08:32:46 crc kubenswrapper[4789]: E1216 08:32:46.542255 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8\": container with ID starting with fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8 not found: ID does not exist" containerID="fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.542297 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8"} err="failed to get container status \"fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8\": rpc error: code = NotFound desc = could not find container \"fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8\": container with ID starting with fab79c4bd9d401da30423ea93616953dc8b0d2c3f2a549d9d40a40b8408e60e8 not found: ID does not exist" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.542325 4789 scope.go:117] "RemoveContainer" containerID="a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5" Dec 16 08:32:46 crc kubenswrapper[4789]: E1216 08:32:46.542623 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5\": container with ID starting with a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5 not found: ID does not exist" containerID="a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5" Dec 16 08:32:46 crc kubenswrapper[4789]: I1216 08:32:46.542659 4789 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5"} err="failed to get container status \"a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5\": rpc error: code = NotFound desc = could not find container \"a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5\": container with ID starting with a04c828438c1fd0d970aad623f4c553a281cc7fbf0496e00cd48dfbb60e261d5 not found: ID does not exist" Dec 16 08:32:48 crc kubenswrapper[4789]: I1216 08:32:48.116991 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" path="/var/lib/kubelet/pods/7a9d97d2-478f-4339-9a86-8b998f0b47fe/volumes" Dec 16 08:33:01 crc kubenswrapper[4789]: I1216 08:33:01.105879 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:33:01 crc kubenswrapper[4789]: E1216 08:33:01.106829 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:33:15 crc kubenswrapper[4789]: I1216 08:33:15.105407 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:33:15 crc kubenswrapper[4789]: E1216 08:33:15.106144 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:33:30 crc kubenswrapper[4789]: I1216 08:33:30.105595 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:33:30 crc kubenswrapper[4789]: E1216 08:33:30.106349 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:33:42 crc kubenswrapper[4789]: I1216 08:33:42.111728 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:33:42 crc kubenswrapper[4789]: E1216 08:33:42.112583 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:33:56 crc kubenswrapper[4789]: I1216 08:33:56.106076 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:33:56 crc kubenswrapper[4789]: E1216 08:33:56.107413 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:34:08 crc kubenswrapper[4789]: I1216 08:34:08.105864 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:34:08 crc kubenswrapper[4789]: E1216 08:34:08.108001 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:34:10 crc kubenswrapper[4789]: I1216 08:34:10.058477 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-70bf-account-create-update-rcjlx"] Dec 16 08:34:10 crc kubenswrapper[4789]: I1216 08:34:10.067062 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-n2c9f"] Dec 16 08:34:10 crc kubenswrapper[4789]: I1216 08:34:10.075073 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-n2c9f"] Dec 16 08:34:10 crc kubenswrapper[4789]: I1216 08:34:10.083535 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-70bf-account-create-update-rcjlx"] Dec 16 08:34:10 crc kubenswrapper[4789]: I1216 08:34:10.116116 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493ed487-62fd-429c-bbef-2e2a28daa9f5" path="/var/lib/kubelet/pods/493ed487-62fd-429c-bbef-2e2a28daa9f5/volumes" Dec 16 08:34:10 crc kubenswrapper[4789]: I1216 08:34:10.116701 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e697d55c-bf66-4e4a-a68d-150bfd848aeb" 
path="/var/lib/kubelet/pods/e697d55c-bf66-4e4a-a68d-150bfd848aeb/volumes" Dec 16 08:34:19 crc kubenswrapper[4789]: I1216 08:34:19.105343 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:34:19 crc kubenswrapper[4789]: E1216 08:34:19.106192 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.392018 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-46fv6"] Dec 16 08:34:23 crc kubenswrapper[4789]: E1216 08:34:23.392866 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="extract-content" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.392883 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="extract-content" Dec 16 08:34:23 crc kubenswrapper[4789]: E1216 08:34:23.392903 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="registry-server" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.392927 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="registry-server" Dec 16 08:34:23 crc kubenswrapper[4789]: E1216 08:34:23.392943 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="extract-utilities" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.392949 4789 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="extract-utilities" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.393163 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9d97d2-478f-4339-9a86-8b998f0b47fe" containerName="registry-server" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.394739 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.403038 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46fv6"] Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.480262 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff5n8\" (UniqueName: \"kubernetes.io/projected/87217ba4-8fbe-4d75-9919-cf566462b00d-kube-api-access-ff5n8\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.480681 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-utilities\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.480822 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-catalog-content\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 
08:34:23.583513 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-utilities\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.583617 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-catalog-content\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.583735 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff5n8\" (UniqueName: \"kubernetes.io/projected/87217ba4-8fbe-4d75-9919-cf566462b00d-kube-api-access-ff5n8\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.584087 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-utilities\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.584160 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-catalog-content\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.608893 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff5n8\" (UniqueName: \"kubernetes.io/projected/87217ba4-8fbe-4d75-9919-cf566462b00d-kube-api-access-ff5n8\") pod \"certified-operators-46fv6\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:23 crc kubenswrapper[4789]: I1216 08:34:23.715008 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:24 crc kubenswrapper[4789]: I1216 08:34:24.307770 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46fv6"] Dec 16 08:34:25 crc kubenswrapper[4789]: I1216 08:34:25.335769 4789 generic.go:334] "Generic (PLEG): container finished" podID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerID="25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26" exitCode=0 Dec 16 08:34:25 crc kubenswrapper[4789]: I1216 08:34:25.335839 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46fv6" event={"ID":"87217ba4-8fbe-4d75-9919-cf566462b00d","Type":"ContainerDied","Data":"25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26"} Dec 16 08:34:25 crc kubenswrapper[4789]: I1216 08:34:25.336100 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46fv6" event={"ID":"87217ba4-8fbe-4d75-9919-cf566462b00d","Type":"ContainerStarted","Data":"040dea9e5a2faf4b8d81e3d8be0fb6d4ae1b795afdc40991d98b309a2eb27b58"} Dec 16 08:34:26 crc kubenswrapper[4789]: I1216 08:34:26.345948 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46fv6" event={"ID":"87217ba4-8fbe-4d75-9919-cf566462b00d","Type":"ContainerStarted","Data":"c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a"} Dec 16 08:34:27 crc kubenswrapper[4789]: I1216 08:34:27.043738 4789 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-nv87k"] Dec 16 08:34:27 crc kubenswrapper[4789]: I1216 08:34:27.051425 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-nv87k"] Dec 16 08:34:27 crc kubenswrapper[4789]: I1216 08:34:27.359416 4789 generic.go:334] "Generic (PLEG): container finished" podID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerID="c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a" exitCode=0 Dec 16 08:34:27 crc kubenswrapper[4789]: I1216 08:34:27.359527 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46fv6" event={"ID":"87217ba4-8fbe-4d75-9919-cf566462b00d","Type":"ContainerDied","Data":"c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a"} Dec 16 08:34:28 crc kubenswrapper[4789]: I1216 08:34:28.118524 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7120d155-aee7-4268-ab9f-f3adc640fb88" path="/var/lib/kubelet/pods/7120d155-aee7-4268-ab9f-f3adc640fb88/volumes" Dec 16 08:34:29 crc kubenswrapper[4789]: I1216 08:34:29.379324 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46fv6" event={"ID":"87217ba4-8fbe-4d75-9919-cf566462b00d","Type":"ContainerStarted","Data":"ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3"} Dec 16 08:34:29 crc kubenswrapper[4789]: I1216 08:34:29.407939 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-46fv6" podStartSLOduration=3.472442982 podStartE2EDuration="6.40789837s" podCreationTimestamp="2025-12-16 08:34:23 +0000 UTC" firstStartedPulling="2025-12-16 08:34:25.338885641 +0000 UTC m=+6203.600773270" lastFinishedPulling="2025-12-16 08:34:28.274341009 +0000 UTC m=+6206.536228658" observedRunningTime="2025-12-16 08:34:29.398304235 +0000 UTC m=+6207.660191874" watchObservedRunningTime="2025-12-16 08:34:29.40789837 
+0000 UTC m=+6207.669786019" Dec 16 08:34:33 crc kubenswrapper[4789]: I1216 08:34:33.105994 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:34:33 crc kubenswrapper[4789]: E1216 08:34:33.107374 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:34:33 crc kubenswrapper[4789]: I1216 08:34:33.304080 4789 scope.go:117] "RemoveContainer" containerID="935a814bb7c56e1f0690250e92a1bc1c636355763b4dec39b6531540696220d8" Dec 16 08:34:33 crc kubenswrapper[4789]: I1216 08:34:33.348495 4789 scope.go:117] "RemoveContainer" containerID="7c5c96973b48500fdd950a9c9328fabecb981ca9822e3eb43681bc4897e94864" Dec 16 08:34:33 crc kubenswrapper[4789]: I1216 08:34:33.398884 4789 scope.go:117] "RemoveContainer" containerID="523d879fcb458f3047ee7911321cdd465ac7e87b987562f12e3689cec379fdfc" Dec 16 08:34:33 crc kubenswrapper[4789]: I1216 08:34:33.715485 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:33 crc kubenswrapper[4789]: I1216 08:34:33.715564 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:33 crc kubenswrapper[4789]: I1216 08:34:33.770740 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:34 crc kubenswrapper[4789]: I1216 08:34:34.512895 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-46fv6" 
Dec 16 08:34:34 crc kubenswrapper[4789]: I1216 08:34:34.567536 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46fv6"] Dec 16 08:34:36 crc kubenswrapper[4789]: I1216 08:34:36.471128 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-46fv6" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="registry-server" containerID="cri-o://ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3" gracePeriod=2 Dec 16 08:34:36 crc kubenswrapper[4789]: I1216 08:34:36.929527 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.050332 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-catalog-content\") pod \"87217ba4-8fbe-4d75-9919-cf566462b00d\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.050733 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-utilities\") pod \"87217ba4-8fbe-4d75-9919-cf566462b00d\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.050877 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff5n8\" (UniqueName: \"kubernetes.io/projected/87217ba4-8fbe-4d75-9919-cf566462b00d-kube-api-access-ff5n8\") pod \"87217ba4-8fbe-4d75-9919-cf566462b00d\" (UID: \"87217ba4-8fbe-4d75-9919-cf566462b00d\") " Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.051545 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-utilities" (OuterVolumeSpecName: "utilities") pod "87217ba4-8fbe-4d75-9919-cf566462b00d" (UID: "87217ba4-8fbe-4d75-9919-cf566462b00d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.051803 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.056049 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87217ba4-8fbe-4d75-9919-cf566462b00d-kube-api-access-ff5n8" (OuterVolumeSpecName: "kube-api-access-ff5n8") pod "87217ba4-8fbe-4d75-9919-cf566462b00d" (UID: "87217ba4-8fbe-4d75-9919-cf566462b00d"). InnerVolumeSpecName "kube-api-access-ff5n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.096089 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87217ba4-8fbe-4d75-9919-cf566462b00d" (UID: "87217ba4-8fbe-4d75-9919-cf566462b00d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.153756 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87217ba4-8fbe-4d75-9919-cf566462b00d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.153797 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff5n8\" (UniqueName: \"kubernetes.io/projected/87217ba4-8fbe-4d75-9919-cf566462b00d-kube-api-access-ff5n8\") on node \"crc\" DevicePath \"\"" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.482327 4789 generic.go:334] "Generic (PLEG): container finished" podID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerID="ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3" exitCode=0 Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.482373 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46fv6" event={"ID":"87217ba4-8fbe-4d75-9919-cf566462b00d","Type":"ContainerDied","Data":"ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3"} Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.482409 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46fv6" event={"ID":"87217ba4-8fbe-4d75-9919-cf566462b00d","Type":"ContainerDied","Data":"040dea9e5a2faf4b8d81e3d8be0fb6d4ae1b795afdc40991d98b309a2eb27b58"} Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.482428 4789 scope.go:117] "RemoveContainer" containerID="ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.482440 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46fv6" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.515633 4789 scope.go:117] "RemoveContainer" containerID="c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.525007 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46fv6"] Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.534306 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-46fv6"] Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.551877 4789 scope.go:117] "RemoveContainer" containerID="25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.583358 4789 scope.go:117] "RemoveContainer" containerID="ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3" Dec 16 08:34:37 crc kubenswrapper[4789]: E1216 08:34:37.583838 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3\": container with ID starting with ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3 not found: ID does not exist" containerID="ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.583866 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3"} err="failed to get container status \"ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3\": rpc error: code = NotFound desc = could not find container \"ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3\": container with ID starting with ca81eee2266541d1c90f50737e43a82d0b63e108d3fe4020a1bf689d292929d3 not 
found: ID does not exist" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.583886 4789 scope.go:117] "RemoveContainer" containerID="c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a" Dec 16 08:34:37 crc kubenswrapper[4789]: E1216 08:34:37.584647 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a\": container with ID starting with c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a not found: ID does not exist" containerID="c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.584687 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a"} err="failed to get container status \"c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a\": rpc error: code = NotFound desc = could not find container \"c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a\": container with ID starting with c7f85c6f11aad76455db5d87315b496bfada515e50a69731e8176b59f932602a not found: ID does not exist" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.584716 4789 scope.go:117] "RemoveContainer" containerID="25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26" Dec 16 08:34:37 crc kubenswrapper[4789]: E1216 08:34:37.585266 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26\": container with ID starting with 25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26 not found: ID does not exist" containerID="25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26" Dec 16 08:34:37 crc kubenswrapper[4789]: I1216 08:34:37.585291 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26"} err="failed to get container status \"25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26\": rpc error: code = NotFound desc = could not find container \"25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26\": container with ID starting with 25ee7e99a915d0b856319456f31b5abc145875ab3a29b19c615b0c3f2cadfb26 not found: ID does not exist" Dec 16 08:34:38 crc kubenswrapper[4789]: I1216 08:34:38.118564 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" path="/var/lib/kubelet/pods/87217ba4-8fbe-4d75-9919-cf566462b00d/volumes" Dec 16 08:34:47 crc kubenswrapper[4789]: I1216 08:34:47.105162 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:34:47 crc kubenswrapper[4789]: E1216 08:34:47.107865 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:35:01 crc kubenswrapper[4789]: I1216 08:35:01.105794 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:35:01 crc kubenswrapper[4789]: E1216 08:35:01.107866 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:35:16 crc kubenswrapper[4789]: I1216 08:35:16.105732 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:35:16 crc kubenswrapper[4789]: E1216 08:35:16.106461 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:35:28 crc kubenswrapper[4789]: I1216 08:35:28.105608 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:35:28 crc kubenswrapper[4789]: E1216 08:35:28.106456 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:35:40 crc kubenswrapper[4789]: I1216 08:35:40.105198 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:35:40 crc kubenswrapper[4789]: E1216 08:35:40.106010 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:35:52 crc kubenswrapper[4789]: I1216 08:35:52.111040 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:35:52 crc kubenswrapper[4789]: E1216 08:35:52.111679 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:36:04 crc kubenswrapper[4789]: I1216 08:36:04.105007 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:36:04 crc kubenswrapper[4789]: E1216 08:36:04.105697 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:36:17 crc kubenswrapper[4789]: I1216 08:36:17.104902 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:36:17 crc kubenswrapper[4789]: E1216 08:36:17.105702 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:36:31 crc kubenswrapper[4789]: I1216 08:36:31.104809 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:36:31 crc kubenswrapper[4789]: I1216 08:36:31.512395 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"e96559da65a9a61ec50ea75abb21de6d4bc35a43a9cc3128626162b522be7ef7"} Dec 16 08:36:44 crc kubenswrapper[4789]: I1216 08:36:44.042197 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-f8mvk"] Dec 16 08:36:44 crc kubenswrapper[4789]: I1216 08:36:44.052121 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-f8mvk"] Dec 16 08:36:44 crc kubenswrapper[4789]: I1216 08:36:44.116849 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcd73d42-7e0f-4b63-9f9e-cd042b827fe3" path="/var/lib/kubelet/pods/bcd73d42-7e0f-4b63-9f9e-cd042b827fe3/volumes" Dec 16 08:36:45 crc kubenswrapper[4789]: I1216 08:36:45.032516 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-61a3-account-create-update-vc728"] Dec 16 08:36:45 crc kubenswrapper[4789]: I1216 08:36:45.041131 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-61a3-account-create-update-vc728"] Dec 16 08:36:46 crc kubenswrapper[4789]: I1216 08:36:46.117219 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c071c8a0-c255-478a-aceb-305f7c8139a5" path="/var/lib/kubelet/pods/c071c8a0-c255-478a-aceb-305f7c8139a5/volumes" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 
08:36:48.422805 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2wg6r"] Dec 16 08:36:48 crc kubenswrapper[4789]: E1216 08:36:48.423678 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="extract-content" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.423693 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="extract-content" Dec 16 08:36:48 crc kubenswrapper[4789]: E1216 08:36:48.423711 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="registry-server" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.423717 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="registry-server" Dec 16 08:36:48 crc kubenswrapper[4789]: E1216 08:36:48.423734 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="extract-utilities" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.423740 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="extract-utilities" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.423975 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="87217ba4-8fbe-4d75-9919-cf566462b00d" containerName="registry-server" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.425671 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.440016 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2wg6r"] Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.618587 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9sw8\" (UniqueName: \"kubernetes.io/projected/72839d83-a021-43c2-91e3-0979cfc4b788-kube-api-access-k9sw8\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.619146 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-utilities\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.619424 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-catalog-content\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.721736 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-utilities\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.722200 4789 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-utilities\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.722346 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-catalog-content\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.722608 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-catalog-content\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.722660 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9sw8\" (UniqueName: \"kubernetes.io/projected/72839d83-a021-43c2-91e3-0979cfc4b788-kube-api-access-k9sw8\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:48 crc kubenswrapper[4789]: I1216 08:36:48.746305 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9sw8\" (UniqueName: \"kubernetes.io/projected/72839d83-a021-43c2-91e3-0979cfc4b788-kube-api-access-k9sw8\") pod \"redhat-marketplace-2wg6r\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:49 crc kubenswrapper[4789]: I1216 08:36:49.044859 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:49 crc kubenswrapper[4789]: I1216 08:36:49.512534 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2wg6r"] Dec 16 08:36:49 crc kubenswrapper[4789]: I1216 08:36:49.688811 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2wg6r" event={"ID":"72839d83-a021-43c2-91e3-0979cfc4b788","Type":"ContainerStarted","Data":"accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa"} Dec 16 08:36:49 crc kubenswrapper[4789]: I1216 08:36:49.689256 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2wg6r" event={"ID":"72839d83-a021-43c2-91e3-0979cfc4b788","Type":"ContainerStarted","Data":"99be98fcb204d1a0267ffbdbba4ada4bb2678ea434562dfa294e7671ac3dfbfd"} Dec 16 08:36:50 crc kubenswrapper[4789]: I1216 08:36:50.699326 4789 generic.go:334] "Generic (PLEG): container finished" podID="72839d83-a021-43c2-91e3-0979cfc4b788" containerID="accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa" exitCode=0 Dec 16 08:36:50 crc kubenswrapper[4789]: I1216 08:36:50.699379 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2wg6r" event={"ID":"72839d83-a021-43c2-91e3-0979cfc4b788","Type":"ContainerDied","Data":"accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa"} Dec 16 08:36:52 crc kubenswrapper[4789]: I1216 08:36:52.726692 4789 generic.go:334] "Generic (PLEG): container finished" podID="72839d83-a021-43c2-91e3-0979cfc4b788" containerID="3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c" exitCode=0 Dec 16 08:36:52 crc kubenswrapper[4789]: I1216 08:36:52.726786 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2wg6r" 
event={"ID":"72839d83-a021-43c2-91e3-0979cfc4b788","Type":"ContainerDied","Data":"3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c"} Dec 16 08:36:53 crc kubenswrapper[4789]: I1216 08:36:53.737800 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2wg6r" event={"ID":"72839d83-a021-43c2-91e3-0979cfc4b788","Type":"ContainerStarted","Data":"599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7"} Dec 16 08:36:53 crc kubenswrapper[4789]: I1216 08:36:53.762666 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2wg6r" podStartSLOduration=3.16639012 podStartE2EDuration="5.762646569s" podCreationTimestamp="2025-12-16 08:36:48 +0000 UTC" firstStartedPulling="2025-12-16 08:36:50.70156066 +0000 UTC m=+6348.963448289" lastFinishedPulling="2025-12-16 08:36:53.297817109 +0000 UTC m=+6351.559704738" observedRunningTime="2025-12-16 08:36:53.754144971 +0000 UTC m=+6352.016032590" watchObservedRunningTime="2025-12-16 08:36:53.762646569 +0000 UTC m=+6352.024534218" Dec 16 08:36:59 crc kubenswrapper[4789]: I1216 08:36:59.045877 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:59 crc kubenswrapper[4789]: I1216 08:36:59.047335 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:59 crc kubenswrapper[4789]: I1216 08:36:59.094198 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:59 crc kubenswrapper[4789]: I1216 08:36:59.839124 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:36:59 crc kubenswrapper[4789]: I1216 08:36:59.891475 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2wg6r"] Dec 16 08:37:01 crc kubenswrapper[4789]: I1216 08:37:01.038806 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-bv2lw"] Dec 16 08:37:01 crc kubenswrapper[4789]: I1216 08:37:01.048078 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-bv2lw"] Dec 16 08:37:01 crc kubenswrapper[4789]: I1216 08:37:01.806768 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2wg6r" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" containerName="registry-server" containerID="cri-o://599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7" gracePeriod=2 Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.131595 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc1a722-c545-4ed5-b458-e8c9864a1e19" path="/var/lib/kubelet/pods/8dc1a722-c545-4ed5-b458-e8c9864a1e19/volumes" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.397233 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.507207 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-utilities\") pod \"72839d83-a021-43c2-91e3-0979cfc4b788\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.507406 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-catalog-content\") pod \"72839d83-a021-43c2-91e3-0979cfc4b788\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.507523 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9sw8\" (UniqueName: \"kubernetes.io/projected/72839d83-a021-43c2-91e3-0979cfc4b788-kube-api-access-k9sw8\") pod \"72839d83-a021-43c2-91e3-0979cfc4b788\" (UID: \"72839d83-a021-43c2-91e3-0979cfc4b788\") " Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.507963 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-utilities" (OuterVolumeSpecName: "utilities") pod "72839d83-a021-43c2-91e3-0979cfc4b788" (UID: "72839d83-a021-43c2-91e3-0979cfc4b788"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.513081 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72839d83-a021-43c2-91e3-0979cfc4b788-kube-api-access-k9sw8" (OuterVolumeSpecName: "kube-api-access-k9sw8") pod "72839d83-a021-43c2-91e3-0979cfc4b788" (UID: "72839d83-a021-43c2-91e3-0979cfc4b788"). InnerVolumeSpecName "kube-api-access-k9sw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.529332 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72839d83-a021-43c2-91e3-0979cfc4b788" (UID: "72839d83-a021-43c2-91e3-0979cfc4b788"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.609981 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.610013 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72839d83-a021-43c2-91e3-0979cfc4b788-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.610023 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9sw8\" (UniqueName: \"kubernetes.io/projected/72839d83-a021-43c2-91e3-0979cfc4b788-kube-api-access-k9sw8\") on node \"crc\" DevicePath \"\"" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.816326 4789 generic.go:334] "Generic (PLEG): container finished" podID="72839d83-a021-43c2-91e3-0979cfc4b788" containerID="599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7" exitCode=0 Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.816379 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2wg6r" event={"ID":"72839d83-a021-43c2-91e3-0979cfc4b788","Type":"ContainerDied","Data":"599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7"} Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.816419 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2wg6r" event={"ID":"72839d83-a021-43c2-91e3-0979cfc4b788","Type":"ContainerDied","Data":"99be98fcb204d1a0267ffbdbba4ada4bb2678ea434562dfa294e7671ac3dfbfd"} Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.816439 4789 scope.go:117] "RemoveContainer" containerID="599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.816446 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2wg6r" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.836316 4789 scope.go:117] "RemoveContainer" containerID="3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.852283 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2wg6r"] Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.863274 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2wg6r"] Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.870620 4789 scope.go:117] "RemoveContainer" containerID="accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.905758 4789 scope.go:117] "RemoveContainer" containerID="599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7" Dec 16 08:37:02 crc kubenswrapper[4789]: E1216 08:37:02.906180 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7\": container with ID starting with 599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7 not found: ID does not exist" containerID="599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.906235 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7"} err="failed to get container status \"599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7\": rpc error: code = NotFound desc = could not find container \"599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7\": container with ID starting with 599b87d8edbdc8c980b3a6316ddc26c67aa885123adf4b919cba8128de04b6a7 not found: ID does not exist" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.906266 4789 scope.go:117] "RemoveContainer" containerID="3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c" Dec 16 08:37:02 crc kubenswrapper[4789]: E1216 08:37:02.906539 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c\": container with ID starting with 3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c not found: ID does not exist" containerID="3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.906567 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c"} err="failed to get container status \"3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c\": rpc error: code = NotFound desc = could not find container \"3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c\": container with ID starting with 3554edd325da54abba83d1330b9d5241ddeb8d66666af23b964718f5d0c65c2c not found: ID does not exist" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.906582 4789 scope.go:117] "RemoveContainer" containerID="accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa" Dec 16 08:37:02 crc kubenswrapper[4789]: E1216 
08:37:02.906774 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa\": container with ID starting with accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa not found: ID does not exist" containerID="accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa" Dec 16 08:37:02 crc kubenswrapper[4789]: I1216 08:37:02.906796 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa"} err="failed to get container status \"accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa\": rpc error: code = NotFound desc = could not find container \"accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa\": container with ID starting with accc02dec6bb39836aedc0e050ed6630dab693bbc98c06b9b07c3881713b7efa not found: ID does not exist" Dec 16 08:37:04 crc kubenswrapper[4789]: I1216 08:37:04.116662 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" path="/var/lib/kubelet/pods/72839d83-a021-43c2-91e3-0979cfc4b788/volumes" Dec 16 08:37:19 crc kubenswrapper[4789]: I1216 08:37:19.030822 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-gnbxf"] Dec 16 08:37:19 crc kubenswrapper[4789]: I1216 08:37:19.044078 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-d8d3-account-create-update-646vd"] Dec 16 08:37:19 crc kubenswrapper[4789]: I1216 08:37:19.060075 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-gnbxf"] Dec 16 08:37:19 crc kubenswrapper[4789]: I1216 08:37:19.080552 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-d8d3-account-create-update-646vd"] Dec 16 08:37:20 crc kubenswrapper[4789]: I1216 08:37:20.117498 4789 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117cb35b-e746-480c-bec0-65c7307f3dc2" path="/var/lib/kubelet/pods/117cb35b-e746-480c-bec0-65c7307f3dc2/volumes" Dec 16 08:37:20 crc kubenswrapper[4789]: I1216 08:37:20.118155 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde5c8bd-5cc0-4f13-acb8-efe1c8560202" path="/var/lib/kubelet/pods/bde5c8bd-5cc0-4f13-acb8-efe1c8560202/volumes" Dec 16 08:37:32 crc kubenswrapper[4789]: I1216 08:37:32.042145 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-xf5wn"] Dec 16 08:37:32 crc kubenswrapper[4789]: I1216 08:37:32.050699 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-xf5wn"] Dec 16 08:37:32 crc kubenswrapper[4789]: I1216 08:37:32.116370 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a044ea9-86b0-4a61-b6b9-55f3663f1d3a" path="/var/lib/kubelet/pods/1a044ea9-86b0-4a61-b6b9-55f3663f1d3a/volumes" Dec 16 08:37:33 crc kubenswrapper[4789]: I1216 08:37:33.544761 4789 scope.go:117] "RemoveContainer" containerID="970e38dd654fd25c83865011e1140c028e03554f3fe0f84c3365a6dfd3c1498b" Dec 16 08:37:33 crc kubenswrapper[4789]: I1216 08:37:33.570627 4789 scope.go:117] "RemoveContainer" containerID="6627f3388c5adde55de17df45e4839aa21d7d5b9c2f9f471a05c7e4f258a1282" Dec 16 08:37:33 crc kubenswrapper[4789]: I1216 08:37:33.622453 4789 scope.go:117] "RemoveContainer" containerID="52f54f0bab32d34879fc2cb96a8e08be29093809c7cecc771e40d6362b3548fd" Dec 16 08:37:33 crc kubenswrapper[4789]: I1216 08:37:33.673852 4789 scope.go:117] "RemoveContainer" containerID="83b7e30363c240633f6e9c730caca7e2cf5e9acbc70da3518a11409db100ca54" Dec 16 08:37:33 crc kubenswrapper[4789]: I1216 08:37:33.719889 4789 scope.go:117] "RemoveContainer" containerID="f9cba574f842f45fefa861d72dbe686297c23958516ac88a28f8366d0364219a" Dec 16 08:37:33 crc kubenswrapper[4789]: I1216 08:37:33.779266 4789 scope.go:117] "RemoveContainer" 
containerID="6e3986ef0345e0cc99bacdcdd2db04d15c36b734615cfa4ffbf4ab212107d50e" Dec 16 08:38:51 crc kubenswrapper[4789]: I1216 08:38:51.927989 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:38:51 crc kubenswrapper[4789]: I1216 08:38:51.929647 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:39:21 crc kubenswrapper[4789]: I1216 08:39:21.927941 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:39:21 crc kubenswrapper[4789]: I1216 08:39:21.928459 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:39:51 crc kubenswrapper[4789]: I1216 08:39:51.927464 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:39:51 crc kubenswrapper[4789]: I1216 08:39:51.927992 4789 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:39:51 crc kubenswrapper[4789]: I1216 08:39:51.928051 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:39:51 crc kubenswrapper[4789]: I1216 08:39:51.928886 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e96559da65a9a61ec50ea75abb21de6d4bc35a43a9cc3128626162b522be7ef7"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:39:51 crc kubenswrapper[4789]: I1216 08:39:51.928948 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://e96559da65a9a61ec50ea75abb21de6d4bc35a43a9cc3128626162b522be7ef7" gracePeriod=600 Dec 16 08:39:52 crc kubenswrapper[4789]: I1216 08:39:52.388610 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="e96559da65a9a61ec50ea75abb21de6d4bc35a43a9cc3128626162b522be7ef7" exitCode=0 Dec 16 08:39:52 crc kubenswrapper[4789]: I1216 08:39:52.388895 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"e96559da65a9a61ec50ea75abb21de6d4bc35a43a9cc3128626162b522be7ef7"} Dec 16 08:39:52 crc kubenswrapper[4789]: I1216 
08:39:52.388945 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7"} Dec 16 08:39:52 crc kubenswrapper[4789]: I1216 08:39:52.388962 4789 scope.go:117] "RemoveContainer" containerID="c575c51b89cd91f72bf66c6c810794acd4cca057e64d20cb54f23bd3d63f813d" Dec 16 08:40:09 crc kubenswrapper[4789]: I1216 08:40:09.537248 4789 generic.go:334] "Generic (PLEG): container finished" podID="18288168-e59a-407b-99e1-0a8f2a73109d" containerID="db198ecc9ea9582716c197fe7943ecb3e495abccf004efba7e1e863c71a3e221" exitCode=0 Dec 16 08:40:09 crc kubenswrapper[4789]: I1216 08:40:09.537510 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" event={"ID":"18288168-e59a-407b-99e1-0a8f2a73109d","Type":"ContainerDied","Data":"db198ecc9ea9582716c197fe7943ecb3e495abccf004efba7e1e863c71a3e221"} Dec 16 08:40:10 crc kubenswrapper[4789]: I1216 08:40:10.961874 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.059871 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-inventory\") pod \"18288168-e59a-407b-99e1-0a8f2a73109d\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.059939 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ssh-key\") pod \"18288168-e59a-407b-99e1-0a8f2a73109d\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.059978 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqt65\" (UniqueName: \"kubernetes.io/projected/18288168-e59a-407b-99e1-0a8f2a73109d-kube-api-access-nqt65\") pod \"18288168-e59a-407b-99e1-0a8f2a73109d\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.060002 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ceph\") pod \"18288168-e59a-407b-99e1-0a8f2a73109d\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.060030 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-tripleo-cleanup-combined-ca-bundle\") pod \"18288168-e59a-407b-99e1-0a8f2a73109d\" (UID: \"18288168-e59a-407b-99e1-0a8f2a73109d\") " Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.065391 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "18288168-e59a-407b-99e1-0a8f2a73109d" (UID: "18288168-e59a-407b-99e1-0a8f2a73109d"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.065772 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18288168-e59a-407b-99e1-0a8f2a73109d-kube-api-access-nqt65" (OuterVolumeSpecName: "kube-api-access-nqt65") pod "18288168-e59a-407b-99e1-0a8f2a73109d" (UID: "18288168-e59a-407b-99e1-0a8f2a73109d"). InnerVolumeSpecName "kube-api-access-nqt65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.066430 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ceph" (OuterVolumeSpecName: "ceph") pod "18288168-e59a-407b-99e1-0a8f2a73109d" (UID: "18288168-e59a-407b-99e1-0a8f2a73109d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.086238 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18288168-e59a-407b-99e1-0a8f2a73109d" (UID: "18288168-e59a-407b-99e1-0a8f2a73109d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.093028 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-inventory" (OuterVolumeSpecName: "inventory") pod "18288168-e59a-407b-99e1-0a8f2a73109d" (UID: "18288168-e59a-407b-99e1-0a8f2a73109d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.162948 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.162986 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.163033 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqt65\" (UniqueName: \"kubernetes.io/projected/18288168-e59a-407b-99e1-0a8f2a73109d-kube-api-access-nqt65\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.163047 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.163059 4789 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18288168-e59a-407b-99e1-0a8f2a73109d-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.555736 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" event={"ID":"18288168-e59a-407b-99e1-0a8f2a73109d","Type":"ContainerDied","Data":"1ccc1e0bdde25aa134220e409f63820329dfdc49dd32a3f82a158e974ddce224"} Dec 16 08:40:11 crc kubenswrapper[4789]: I1216 08:40:11.555814 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ccc1e0bdde25aa134220e409f63820329dfdc49dd32a3f82a158e974ddce224" Dec 16 08:40:11 crc 
kubenswrapper[4789]: I1216 08:40:11.555881 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.729823 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zf49z"] Dec 16 08:40:14 crc kubenswrapper[4789]: E1216 08:40:14.731151 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" containerName="extract-content" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.731169 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" containerName="extract-content" Dec 16 08:40:14 crc kubenswrapper[4789]: E1216 08:40:14.731185 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18288168-e59a-407b-99e1-0a8f2a73109d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.731195 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="18288168-e59a-407b-99e1-0a8f2a73109d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 16 08:40:14 crc kubenswrapper[4789]: E1216 08:40:14.731209 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" containerName="registry-server" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.731217 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" containerName="registry-server" Dec 16 08:40:14 crc kubenswrapper[4789]: E1216 08:40:14.731253 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" containerName="extract-utilities" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.731262 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" 
containerName="extract-utilities" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.731536 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="18288168-e59a-407b-99e1-0a8f2a73109d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.731557 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="72839d83-a021-43c2-91e3-0979cfc4b788" containerName="registry-server" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.732434 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.738178 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.738178 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.738366 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.738366 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.742675 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zf49z"] Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.751144 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88ht7\" (UniqueName: \"kubernetes.io/projected/d5635ad5-e918-492f-b2b3-8e8893ba73e1-kube-api-access-88ht7\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 
08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.751201 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.751330 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.751461 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ceph\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.751566 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-inventory\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.857252 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-inventory\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" 
(UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.857368 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ht7\" (UniqueName: \"kubernetes.io/projected/d5635ad5-e918-492f-b2b3-8e8893ba73e1-kube-api-access-88ht7\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.857400 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.857456 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.858519 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ceph\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.864014 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ceph\") pod 
\"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.864431 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-inventory\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.865513 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.866045 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:14 crc kubenswrapper[4789]: I1216 08:40:14.872245 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ht7\" (UniqueName: \"kubernetes.io/projected/d5635ad5-e918-492f-b2b3-8e8893ba73e1-kube-api-access-88ht7\") pod \"bootstrap-openstack-openstack-cell1-zf49z\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:15 crc kubenswrapper[4789]: I1216 08:40:15.056517 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:40:15 crc kubenswrapper[4789]: I1216 08:40:15.551215 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zf49z"] Dec 16 08:40:15 crc kubenswrapper[4789]: I1216 08:40:15.561261 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:40:15 crc kubenswrapper[4789]: I1216 08:40:15.607036 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" event={"ID":"d5635ad5-e918-492f-b2b3-8e8893ba73e1","Type":"ContainerStarted","Data":"6fa77c49519641e37aec6166e72c013e7bf63656d8f61f15e49783914036e977"} Dec 16 08:40:16 crc kubenswrapper[4789]: I1216 08:40:16.619379 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" event={"ID":"d5635ad5-e918-492f-b2b3-8e8893ba73e1","Type":"ContainerStarted","Data":"97098f0077cdb293abfdb330d1728ab7daca5ea5fe202459ba65544bff133855"} Dec 16 08:40:16 crc kubenswrapper[4789]: I1216 08:40:16.641624 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" podStartSLOduration=2.008498845 podStartE2EDuration="2.64160201s" podCreationTimestamp="2025-12-16 08:40:14 +0000 UTC" firstStartedPulling="2025-12-16 08:40:15.561066559 +0000 UTC m=+6553.822954188" lastFinishedPulling="2025-12-16 08:40:16.194169724 +0000 UTC m=+6554.456057353" observedRunningTime="2025-12-16 08:40:16.634588979 +0000 UTC m=+6554.896476608" watchObservedRunningTime="2025-12-16 08:40:16.64160201 +0000 UTC m=+6554.903489639" Dec 16 08:42:21 crc kubenswrapper[4789]: I1216 08:42:21.927626 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:42:21 crc kubenswrapper[4789]: I1216 08:42:21.928130 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.497859 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dk2zp"] Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.501331 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.515597 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dk2zp"] Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.679686 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8kqk\" (UniqueName: \"kubernetes.io/projected/28b68203-41df-4370-a398-a5ac9da35663-kube-api-access-w8kqk\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.680103 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-catalog-content\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.680181 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-utilities\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.781512 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8kqk\" (UniqueName: \"kubernetes.io/projected/28b68203-41df-4370-a398-a5ac9da35663-kube-api-access-w8kqk\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.781586 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-catalog-content\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.781640 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-utilities\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.782130 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-utilities\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.782373 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-catalog-content\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.803951 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8kqk\" (UniqueName: \"kubernetes.io/projected/28b68203-41df-4370-a398-a5ac9da35663-kube-api-access-w8kqk\") pod \"redhat-operators-dk2zp\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:29 crc kubenswrapper[4789]: I1216 08:42:29.825278 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.363076 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dk2zp"] Dec 16 08:42:30 crc kubenswrapper[4789]: W1216 08:42:30.369325 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28b68203_41df_4370_a398_a5ac9da35663.slice/crio-5354f93646945b71f80ebf103aec6595caf204a156610dfd28f0a5cdb2f5ac1f WatchSource:0}: Error finding container 5354f93646945b71f80ebf103aec6595caf204a156610dfd28f0a5cdb2f5ac1f: Status 404 returned error can't find the container with id 5354f93646945b71f80ebf103aec6595caf204a156610dfd28f0a5cdb2f5ac1f Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.490312 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tf5k4"] Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.517868 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.525330 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf5k4"] Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.623319 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd7nc\" (UniqueName: \"kubernetes.io/projected/3fa54613-7e1e-40b8-ac80-fbb02f09c667-kube-api-access-cd7nc\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.623379 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-utilities\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.623553 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-catalog-content\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.725787 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd7nc\" (UniqueName: \"kubernetes.io/projected/3fa54613-7e1e-40b8-ac80-fbb02f09c667-kube-api-access-cd7nc\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.725879 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-utilities\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.726015 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-catalog-content\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.726575 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-utilities\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.726593 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-catalog-content\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.747782 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd7nc\" (UniqueName: \"kubernetes.io/projected/3fa54613-7e1e-40b8-ac80-fbb02f09c667-kube-api-access-cd7nc\") pod \"community-operators-tf5k4\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.765217 4789 generic.go:334] "Generic (PLEG): container 
finished" podID="28b68203-41df-4370-a398-a5ac9da35663" containerID="d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa" exitCode=0 Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.765271 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk2zp" event={"ID":"28b68203-41df-4370-a398-a5ac9da35663","Type":"ContainerDied","Data":"d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa"} Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.765304 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk2zp" event={"ID":"28b68203-41df-4370-a398-a5ac9da35663","Type":"ContainerStarted","Data":"5354f93646945b71f80ebf103aec6595caf204a156610dfd28f0a5cdb2f5ac1f"} Dec 16 08:42:30 crc kubenswrapper[4789]: I1216 08:42:30.865342 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:31 crc kubenswrapper[4789]: W1216 08:42:31.399448 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa54613_7e1e_40b8_ac80_fbb02f09c667.slice/crio-d14fbf9c6300afa8a922394ab48137faa0f8953ddc6ff7ef7ac56e251bf827a9 WatchSource:0}: Error finding container d14fbf9c6300afa8a922394ab48137faa0f8953ddc6ff7ef7ac56e251bf827a9: Status 404 returned error can't find the container with id d14fbf9c6300afa8a922394ab48137faa0f8953ddc6ff7ef7ac56e251bf827a9 Dec 16 08:42:31 crc kubenswrapper[4789]: I1216 08:42:31.406527 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf5k4"] Dec 16 08:42:31 crc kubenswrapper[4789]: I1216 08:42:31.775406 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk2zp" 
event={"ID":"28b68203-41df-4370-a398-a5ac9da35663","Type":"ContainerStarted","Data":"ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695"} Dec 16 08:42:31 crc kubenswrapper[4789]: I1216 08:42:31.777881 4789 generic.go:334] "Generic (PLEG): container finished" podID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerID="ef4ecaa847b16cc7ae4616b544b731a7e0209751a19bbff339ade24314d90ecb" exitCode=0 Dec 16 08:42:31 crc kubenswrapper[4789]: I1216 08:42:31.777953 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5k4" event={"ID":"3fa54613-7e1e-40b8-ac80-fbb02f09c667","Type":"ContainerDied","Data":"ef4ecaa847b16cc7ae4616b544b731a7e0209751a19bbff339ade24314d90ecb"} Dec 16 08:42:31 crc kubenswrapper[4789]: I1216 08:42:31.777989 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5k4" event={"ID":"3fa54613-7e1e-40b8-ac80-fbb02f09c667","Type":"ContainerStarted","Data":"d14fbf9c6300afa8a922394ab48137faa0f8953ddc6ff7ef7ac56e251bf827a9"} Dec 16 08:42:33 crc kubenswrapper[4789]: I1216 08:42:33.798555 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5k4" event={"ID":"3fa54613-7e1e-40b8-ac80-fbb02f09c667","Type":"ContainerStarted","Data":"9d5c7eb2c0e5d36be155784195c84235a58219e80ff9460bc7d7a828060dd70e"} Dec 16 08:42:34 crc kubenswrapper[4789]: I1216 08:42:34.810327 4789 generic.go:334] "Generic (PLEG): container finished" podID="28b68203-41df-4370-a398-a5ac9da35663" containerID="ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695" exitCode=0 Dec 16 08:42:34 crc kubenswrapper[4789]: I1216 08:42:34.810410 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk2zp" event={"ID":"28b68203-41df-4370-a398-a5ac9da35663","Type":"ContainerDied","Data":"ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695"} Dec 16 08:42:35 crc kubenswrapper[4789]: I1216 
08:42:35.821164 4789 generic.go:334] "Generic (PLEG): container finished" podID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerID="9d5c7eb2c0e5d36be155784195c84235a58219e80ff9460bc7d7a828060dd70e" exitCode=0 Dec 16 08:42:35 crc kubenswrapper[4789]: I1216 08:42:35.821250 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5k4" event={"ID":"3fa54613-7e1e-40b8-ac80-fbb02f09c667","Type":"ContainerDied","Data":"9d5c7eb2c0e5d36be155784195c84235a58219e80ff9460bc7d7a828060dd70e"} Dec 16 08:42:35 crc kubenswrapper[4789]: I1216 08:42:35.826791 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk2zp" event={"ID":"28b68203-41df-4370-a398-a5ac9da35663","Type":"ContainerStarted","Data":"9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910"} Dec 16 08:42:35 crc kubenswrapper[4789]: I1216 08:42:35.866546 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dk2zp" podStartSLOduration=2.29597513 podStartE2EDuration="6.866524065s" podCreationTimestamp="2025-12-16 08:42:29 +0000 UTC" firstStartedPulling="2025-12-16 08:42:30.767249032 +0000 UTC m=+6689.029136661" lastFinishedPulling="2025-12-16 08:42:35.337797967 +0000 UTC m=+6693.599685596" observedRunningTime="2025-12-16 08:42:35.858519019 +0000 UTC m=+6694.120406648" watchObservedRunningTime="2025-12-16 08:42:35.866524065 +0000 UTC m=+6694.128411694" Dec 16 08:42:36 crc kubenswrapper[4789]: I1216 08:42:36.839203 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5k4" event={"ID":"3fa54613-7e1e-40b8-ac80-fbb02f09c667","Type":"ContainerStarted","Data":"077d261d3f3503f641f6a500f94915ae6eab6e804b7d927092613ca61552669a"} Dec 16 08:42:36 crc kubenswrapper[4789]: I1216 08:42:36.863451 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tf5k4" 
podStartSLOduration=2.355864538 podStartE2EDuration="6.863426396s" podCreationTimestamp="2025-12-16 08:42:30 +0000 UTC" firstStartedPulling="2025-12-16 08:42:31.779437885 +0000 UTC m=+6690.041325524" lastFinishedPulling="2025-12-16 08:42:36.286999753 +0000 UTC m=+6694.548887382" observedRunningTime="2025-12-16 08:42:36.86195302 +0000 UTC m=+6695.123840649" watchObservedRunningTime="2025-12-16 08:42:36.863426396 +0000 UTC m=+6695.125314025" Dec 16 08:42:39 crc kubenswrapper[4789]: I1216 08:42:39.825469 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:39 crc kubenswrapper[4789]: I1216 08:42:39.825936 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:40 crc kubenswrapper[4789]: I1216 08:42:40.866028 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:40 crc kubenswrapper[4789]: I1216 08:42:40.866441 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:40 crc kubenswrapper[4789]: I1216 08:42:40.892095 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dk2zp" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="registry-server" probeResult="failure" output=< Dec 16 08:42:40 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 08:42:40 crc kubenswrapper[4789]: > Dec 16 08:42:40 crc kubenswrapper[4789]: I1216 08:42:40.913105 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:41 crc kubenswrapper[4789]: I1216 08:42:41.935151 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tf5k4" Dec 
16 08:42:44 crc kubenswrapper[4789]: I1216 08:42:44.676840 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf5k4"] Dec 16 08:42:44 crc kubenswrapper[4789]: I1216 08:42:44.677369 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tf5k4" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="registry-server" containerID="cri-o://077d261d3f3503f641f6a500f94915ae6eab6e804b7d927092613ca61552669a" gracePeriod=2 Dec 16 08:42:44 crc kubenswrapper[4789]: I1216 08:42:44.915400 4789 generic.go:334] "Generic (PLEG): container finished" podID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerID="077d261d3f3503f641f6a500f94915ae6eab6e804b7d927092613ca61552669a" exitCode=0 Dec 16 08:42:44 crc kubenswrapper[4789]: I1216 08:42:44.915495 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5k4" event={"ID":"3fa54613-7e1e-40b8-ac80-fbb02f09c667","Type":"ContainerDied","Data":"077d261d3f3503f641f6a500f94915ae6eab6e804b7d927092613ca61552669a"} Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.170389 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.320332 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd7nc\" (UniqueName: \"kubernetes.io/projected/3fa54613-7e1e-40b8-ac80-fbb02f09c667-kube-api-access-cd7nc\") pod \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.320507 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-utilities\") pod \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.320527 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-catalog-content\") pod \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\" (UID: \"3fa54613-7e1e-40b8-ac80-fbb02f09c667\") " Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.321860 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-utilities" (OuterVolumeSpecName: "utilities") pod "3fa54613-7e1e-40b8-ac80-fbb02f09c667" (UID: "3fa54613-7e1e-40b8-ac80-fbb02f09c667"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.326419 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa54613-7e1e-40b8-ac80-fbb02f09c667-kube-api-access-cd7nc" (OuterVolumeSpecName: "kube-api-access-cd7nc") pod "3fa54613-7e1e-40b8-ac80-fbb02f09c667" (UID: "3fa54613-7e1e-40b8-ac80-fbb02f09c667"). InnerVolumeSpecName "kube-api-access-cd7nc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.373505 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fa54613-7e1e-40b8-ac80-fbb02f09c667" (UID: "3fa54613-7e1e-40b8-ac80-fbb02f09c667"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.422519 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.422556 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa54613-7e1e-40b8-ac80-fbb02f09c667-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.422569 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd7nc\" (UniqueName: \"kubernetes.io/projected/3fa54613-7e1e-40b8-ac80-fbb02f09c667-kube-api-access-cd7nc\") on node \"crc\" DevicePath \"\"" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.925732 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5k4" event={"ID":"3fa54613-7e1e-40b8-ac80-fbb02f09c667","Type":"ContainerDied","Data":"d14fbf9c6300afa8a922394ab48137faa0f8953ddc6ff7ef7ac56e251bf827a9"} Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.926109 4789 scope.go:117] "RemoveContainer" containerID="077d261d3f3503f641f6a500f94915ae6eab6e804b7d927092613ca61552669a" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.925794 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf5k4" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.960432 4789 scope.go:117] "RemoveContainer" containerID="9d5c7eb2c0e5d36be155784195c84235a58219e80ff9460bc7d7a828060dd70e" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.967652 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf5k4"] Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.981983 4789 scope.go:117] "RemoveContainer" containerID="ef4ecaa847b16cc7ae4616b544b731a7e0209751a19bbff339ade24314d90ecb" Dec 16 08:42:45 crc kubenswrapper[4789]: I1216 08:42:45.988900 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tf5k4"] Dec 16 08:42:46 crc kubenswrapper[4789]: I1216 08:42:46.116613 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" path="/var/lib/kubelet/pods/3fa54613-7e1e-40b8-ac80-fbb02f09c667/volumes" Dec 16 08:42:49 crc kubenswrapper[4789]: I1216 08:42:49.920519 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:49 crc kubenswrapper[4789]: I1216 08:42:49.975639 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:50 crc kubenswrapper[4789]: I1216 08:42:50.159789 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dk2zp"] Dec 16 08:42:50 crc kubenswrapper[4789]: I1216 08:42:50.972131 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dk2zp" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="registry-server" containerID="cri-o://9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910" gracePeriod=2 Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 
08:42:51.497525 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.648597 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-catalog-content\") pod \"28b68203-41df-4370-a398-a5ac9da35663\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.648857 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8kqk\" (UniqueName: \"kubernetes.io/projected/28b68203-41df-4370-a398-a5ac9da35663-kube-api-access-w8kqk\") pod \"28b68203-41df-4370-a398-a5ac9da35663\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.648978 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-utilities\") pod \"28b68203-41df-4370-a398-a5ac9da35663\" (UID: \"28b68203-41df-4370-a398-a5ac9da35663\") " Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.649675 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-utilities" (OuterVolumeSpecName: "utilities") pod "28b68203-41df-4370-a398-a5ac9da35663" (UID: "28b68203-41df-4370-a398-a5ac9da35663"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.655693 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b68203-41df-4370-a398-a5ac9da35663-kube-api-access-w8kqk" (OuterVolumeSpecName: "kube-api-access-w8kqk") pod "28b68203-41df-4370-a398-a5ac9da35663" (UID: "28b68203-41df-4370-a398-a5ac9da35663"). InnerVolumeSpecName "kube-api-access-w8kqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.747037 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28b68203-41df-4370-a398-a5ac9da35663" (UID: "28b68203-41df-4370-a398-a5ac9da35663"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.751269 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.751300 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28b68203-41df-4370-a398-a5ac9da35663-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.751313 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8kqk\" (UniqueName: \"kubernetes.io/projected/28b68203-41df-4370-a398-a5ac9da35663-kube-api-access-w8kqk\") on node \"crc\" DevicePath \"\"" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.927674 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.927745 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.981827 4789 generic.go:334] "Generic (PLEG): container finished" podID="28b68203-41df-4370-a398-a5ac9da35663" containerID="9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910" exitCode=0 Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.981875 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk2zp" event={"ID":"28b68203-41df-4370-a398-a5ac9da35663","Type":"ContainerDied","Data":"9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910"} Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.981904 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dk2zp" event={"ID":"28b68203-41df-4370-a398-a5ac9da35663","Type":"ContainerDied","Data":"5354f93646945b71f80ebf103aec6595caf204a156610dfd28f0a5cdb2f5ac1f"} Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.981941 4789 scope.go:117] "RemoveContainer" containerID="9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910" Dec 16 08:42:51 crc kubenswrapper[4789]: I1216 08:42:51.982097 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dk2zp" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.014445 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dk2zp"] Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.018127 4789 scope.go:117] "RemoveContainer" containerID="ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.022753 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dk2zp"] Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.047725 4789 scope.go:117] "RemoveContainer" containerID="d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.084247 4789 scope.go:117] "RemoveContainer" containerID="9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910" Dec 16 08:42:52 crc kubenswrapper[4789]: E1216 08:42:52.084724 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910\": container with ID starting with 9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910 not found: ID does not exist" containerID="9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.084757 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910"} err="failed to get container status \"9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910\": rpc error: code = NotFound desc = could not find container \"9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910\": container with ID starting with 9ca65671c45fff525860a68672e311eb434dd0084afdbf833dcd8323ae2d3910 not found: ID does 
not exist" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.084785 4789 scope.go:117] "RemoveContainer" containerID="ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695" Dec 16 08:42:52 crc kubenswrapper[4789]: E1216 08:42:52.085165 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695\": container with ID starting with ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695 not found: ID does not exist" containerID="ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.085196 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695"} err="failed to get container status \"ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695\": rpc error: code = NotFound desc = could not find container \"ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695\": container with ID starting with ec6c742dda880a4501cf623df4829655caa4ed25101e63e4d07f8f685a1ab695 not found: ID does not exist" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.085213 4789 scope.go:117] "RemoveContainer" containerID="d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa" Dec 16 08:42:52 crc kubenswrapper[4789]: E1216 08:42:52.085619 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa\": container with ID starting with d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa not found: ID does not exist" containerID="d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.085664 4789 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa"} err="failed to get container status \"d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa\": rpc error: code = NotFound desc = could not find container \"d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa\": container with ID starting with d8c86c6602e1191dd69b27e9116d2a2c92b07e13f4bd85a904c546e6048a16aa not found: ID does not exist" Dec 16 08:42:52 crc kubenswrapper[4789]: I1216 08:42:52.116253 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b68203-41df-4370-a398-a5ac9da35663" path="/var/lib/kubelet/pods/28b68203-41df-4370-a398-a5ac9da35663/volumes" Dec 16 08:43:18 crc kubenswrapper[4789]: I1216 08:43:18.221949 4789 generic.go:334] "Generic (PLEG): container finished" podID="d5635ad5-e918-492f-b2b3-8e8893ba73e1" containerID="97098f0077cdb293abfdb330d1728ab7daca5ea5fe202459ba65544bff133855" exitCode=0 Dec 16 08:43:18 crc kubenswrapper[4789]: I1216 08:43:18.222046 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" event={"ID":"d5635ad5-e918-492f-b2b3-8e8893ba73e1","Type":"ContainerDied","Data":"97098f0077cdb293abfdb330d1728ab7daca5ea5fe202459ba65544bff133855"} Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.658251 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.810932 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ceph\") pod \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.811063 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88ht7\" (UniqueName: \"kubernetes.io/projected/d5635ad5-e918-492f-b2b3-8e8893ba73e1-kube-api-access-88ht7\") pod \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.811137 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-bootstrap-combined-ca-bundle\") pod \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.811170 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ssh-key\") pod \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.811205 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-inventory\") pod \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\" (UID: \"d5635ad5-e918-492f-b2b3-8e8893ba73e1\") " Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.816335 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ceph" (OuterVolumeSpecName: "ceph") pod "d5635ad5-e918-492f-b2b3-8e8893ba73e1" (UID: "d5635ad5-e918-492f-b2b3-8e8893ba73e1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.816355 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d5635ad5-e918-492f-b2b3-8e8893ba73e1" (UID: "d5635ad5-e918-492f-b2b3-8e8893ba73e1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.816693 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5635ad5-e918-492f-b2b3-8e8893ba73e1-kube-api-access-88ht7" (OuterVolumeSpecName: "kube-api-access-88ht7") pod "d5635ad5-e918-492f-b2b3-8e8893ba73e1" (UID: "d5635ad5-e918-492f-b2b3-8e8893ba73e1"). InnerVolumeSpecName "kube-api-access-88ht7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.837107 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5635ad5-e918-492f-b2b3-8e8893ba73e1" (UID: "d5635ad5-e918-492f-b2b3-8e8893ba73e1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.838850 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-inventory" (OuterVolumeSpecName: "inventory") pod "d5635ad5-e918-492f-b2b3-8e8893ba73e1" (UID: "d5635ad5-e918-492f-b2b3-8e8893ba73e1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.914105 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.914151 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88ht7\" (UniqueName: \"kubernetes.io/projected/d5635ad5-e918-492f-b2b3-8e8893ba73e1-kube-api-access-88ht7\") on node \"crc\" DevicePath \"\"" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.914167 4789 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.914177 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:43:19 crc kubenswrapper[4789]: I1216 08:43:19.914189 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5635ad5-e918-492f-b2b3-8e8893ba73e1-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.244712 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" event={"ID":"d5635ad5-e918-492f-b2b3-8e8893ba73e1","Type":"ContainerDied","Data":"6fa77c49519641e37aec6166e72c013e7bf63656d8f61f15e49783914036e977"} Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.244776 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa77c49519641e37aec6166e72c013e7bf63656d8f61f15e49783914036e977" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.244783 4789 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zf49z" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.343683 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-9qlnq"] Dec 16 08:43:20 crc kubenswrapper[4789]: E1216 08:43:20.344528 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="registry-server" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.344626 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="registry-server" Dec 16 08:43:20 crc kubenswrapper[4789]: E1216 08:43:20.344716 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="extract-content" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.344782 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="extract-content" Dec 16 08:43:20 crc kubenswrapper[4789]: E1216 08:43:20.344875 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5635ad5-e918-492f-b2b3-8e8893ba73e1" containerName="bootstrap-openstack-openstack-cell1" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.344981 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5635ad5-e918-492f-b2b3-8e8893ba73e1" containerName="bootstrap-openstack-openstack-cell1" Dec 16 08:43:20 crc kubenswrapper[4789]: E1216 08:43:20.345072 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="extract-utilities" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.345139 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="extract-utilities" Dec 16 08:43:20 crc kubenswrapper[4789]: E1216 
08:43:20.345234 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="registry-server" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.345301 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="registry-server" Dec 16 08:43:20 crc kubenswrapper[4789]: E1216 08:43:20.345381 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="extract-content" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.345456 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="extract-content" Dec 16 08:43:20 crc kubenswrapper[4789]: E1216 08:43:20.345532 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="extract-utilities" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.345634 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="extract-utilities" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.346031 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5635ad5-e918-492f-b2b3-8e8893ba73e1" containerName="bootstrap-openstack-openstack-cell1" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.346106 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b68203-41df-4370-a398-a5ac9da35663" containerName="registry-server" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.346176 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa54613-7e1e-40b8-ac80-fbb02f09c667" containerName="registry-server" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.347144 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.349624 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.349900 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.350101 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.350227 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.357442 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-9qlnq"] Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.527174 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-inventory\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.527465 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ceph\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.527519 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ssh-key\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.527697 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg5tx\" (UniqueName: \"kubernetes.io/projected/4eaf1fcf-a995-49f8-89b5-fb771151402f-kube-api-access-cg5tx\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.631968 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-inventory\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.632491 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ceph\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.632589 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ssh-key\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.632669 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg5tx\" (UniqueName: \"kubernetes.io/projected/4eaf1fcf-a995-49f8-89b5-fb771151402f-kube-api-access-cg5tx\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.636437 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ssh-key\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.637160 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-inventory\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.638170 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ceph\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.660289 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg5tx\" (UniqueName: \"kubernetes.io/projected/4eaf1fcf-a995-49f8-89b5-fb771151402f-kube-api-access-cg5tx\") pod \"download-cache-openstack-openstack-cell1-9qlnq\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" 
Dec 16 08:43:20 crc kubenswrapper[4789]: I1216 08:43:20.670214 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:43:21 crc kubenswrapper[4789]: I1216 08:43:21.224282 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-9qlnq"] Dec 16 08:43:21 crc kubenswrapper[4789]: I1216 08:43:21.257837 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" event={"ID":"4eaf1fcf-a995-49f8-89b5-fb771151402f","Type":"ContainerStarted","Data":"f9614004123acfc2a975c3daa80cd3f14a6e6716b07d3fe58ed5838b99775914"} Dec 16 08:43:21 crc kubenswrapper[4789]: I1216 08:43:21.928161 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:43:21 crc kubenswrapper[4789]: I1216 08:43:21.928477 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:43:21 crc kubenswrapper[4789]: I1216 08:43:21.928513 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:43:21 crc kubenswrapper[4789]: I1216 08:43:21.929237 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7"} 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:43:21 crc kubenswrapper[4789]: I1216 08:43:21.929289 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" gracePeriod=600 Dec 16 08:43:22 crc kubenswrapper[4789]: E1216 08:43:22.060490 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:43:22 crc kubenswrapper[4789]: I1216 08:43:22.271844 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" exitCode=0 Dec 16 08:43:22 crc kubenswrapper[4789]: I1216 08:43:22.271925 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7"} Dec 16 08:43:22 crc kubenswrapper[4789]: I1216 08:43:22.271997 4789 scope.go:117] "RemoveContainer" containerID="e96559da65a9a61ec50ea75abb21de6d4bc35a43a9cc3128626162b522be7ef7" Dec 16 08:43:22 crc kubenswrapper[4789]: I1216 08:43:22.273115 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 
16 08:43:22 crc kubenswrapper[4789]: E1216 08:43:22.273454 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:43:23 crc kubenswrapper[4789]: I1216 08:43:23.282894 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" event={"ID":"4eaf1fcf-a995-49f8-89b5-fb771151402f","Type":"ContainerStarted","Data":"b1efff17fe8229316ac2ac87cc5974506d20aabe09afdacfbf5e7db4593bf6f4"} Dec 16 08:43:23 crc kubenswrapper[4789]: I1216 08:43:23.304832 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" podStartSLOduration=1.764158222 podStartE2EDuration="3.304810738s" podCreationTimestamp="2025-12-16 08:43:20 +0000 UTC" firstStartedPulling="2025-12-16 08:43:21.22666589 +0000 UTC m=+6739.488553519" lastFinishedPulling="2025-12-16 08:43:22.767318406 +0000 UTC m=+6741.029206035" observedRunningTime="2025-12-16 08:43:23.301243922 +0000 UTC m=+6741.563131561" watchObservedRunningTime="2025-12-16 08:43:23.304810738 +0000 UTC m=+6741.566698367" Dec 16 08:43:36 crc kubenswrapper[4789]: I1216 08:43:36.105313 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:43:36 crc kubenswrapper[4789]: E1216 08:43:36.106198 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:43:47 crc kubenswrapper[4789]: I1216 08:43:47.105837 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:43:47 crc kubenswrapper[4789]: E1216 08:43:47.106713 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:44:00 crc kubenswrapper[4789]: I1216 08:44:00.104955 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:44:00 crc kubenswrapper[4789]: E1216 08:44:00.105839 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:44:14 crc kubenswrapper[4789]: I1216 08:44:14.105681 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:44:14 crc kubenswrapper[4789]: E1216 08:44:14.106454 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:44:28 crc kubenswrapper[4789]: I1216 08:44:28.105021 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:44:28 crc kubenswrapper[4789]: E1216 08:44:28.105764 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:44:39 crc kubenswrapper[4789]: I1216 08:44:39.105985 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:44:39 crc kubenswrapper[4789]: E1216 08:44:39.107968 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:44:52 crc kubenswrapper[4789]: I1216 08:44:52.113364 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:44:52 crc kubenswrapper[4789]: E1216 08:44:52.114762 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.144887 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8"] Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.149014 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.154142 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.154460 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.163332 4789 generic.go:334] "Generic (PLEG): container finished" podID="4eaf1fcf-a995-49f8-89b5-fb771151402f" containerID="b1efff17fe8229316ac2ac87cc5974506d20aabe09afdacfbf5e7db4593bf6f4" exitCode=0 Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.163379 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" event={"ID":"4eaf1fcf-a995-49f8-89b5-fb771151402f","Type":"ContainerDied","Data":"b1efff17fe8229316ac2ac87cc5974506d20aabe09afdacfbf5e7db4593bf6f4"} Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.163830 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8"] Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.231058 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chw5r\" (UniqueName: \"kubernetes.io/projected/8549517b-5f73-46a0-805f-2c30803def4a-kube-api-access-chw5r\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.231304 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8549517b-5f73-46a0-805f-2c30803def4a-secret-volume\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.231551 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8549517b-5f73-46a0-805f-2c30803def4a-config-volume\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.333285 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8549517b-5f73-46a0-805f-2c30803def4a-secret-volume\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.333439 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8549517b-5f73-46a0-805f-2c30803def4a-config-volume\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.333498 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chw5r\" (UniqueName: \"kubernetes.io/projected/8549517b-5f73-46a0-805f-2c30803def4a-kube-api-access-chw5r\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.334442 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8549517b-5f73-46a0-805f-2c30803def4a-config-volume\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.359540 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8549517b-5f73-46a0-805f-2c30803def4a-secret-volume\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.360692 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chw5r\" (UniqueName: \"kubernetes.io/projected/8549517b-5f73-46a0-805f-2c30803def4a-kube-api-access-chw5r\") pod \"collect-profiles-29431245-zk4h8\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.485163 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:00 crc kubenswrapper[4789]: I1216 08:45:00.957999 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8"] Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.177595 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" event={"ID":"8549517b-5f73-46a0-805f-2c30803def4a","Type":"ContainerStarted","Data":"961726de38b39a84f1131521ea58dcfefb99e0133cf780baa96fb75be604e06f"} Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.177659 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" event={"ID":"8549517b-5f73-46a0-805f-2c30803def4a","Type":"ContainerStarted","Data":"bd518ba4657ceda2a8a060295badcb02e8d700db966529fa059d07d97e2a9b7a"} Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.204080 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" podStartSLOduration=1.204047167 podStartE2EDuration="1.204047167s" podCreationTimestamp="2025-12-16 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 08:45:01.194104055 +0000 UTC m=+6839.455991684" watchObservedRunningTime="2025-12-16 08:45:01.204047167 +0000 UTC m=+6839.465934796" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.686384 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.765997 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg5tx\" (UniqueName: \"kubernetes.io/projected/4eaf1fcf-a995-49f8-89b5-fb771151402f-kube-api-access-cg5tx\") pod \"4eaf1fcf-a995-49f8-89b5-fb771151402f\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.766342 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-inventory\") pod \"4eaf1fcf-a995-49f8-89b5-fb771151402f\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.766431 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ssh-key\") pod \"4eaf1fcf-a995-49f8-89b5-fb771151402f\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.766585 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ceph\") pod \"4eaf1fcf-a995-49f8-89b5-fb771151402f\" (UID: \"4eaf1fcf-a995-49f8-89b5-fb771151402f\") " Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.773225 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ceph" (OuterVolumeSpecName: "ceph") pod "4eaf1fcf-a995-49f8-89b5-fb771151402f" (UID: "4eaf1fcf-a995-49f8-89b5-fb771151402f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.773813 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eaf1fcf-a995-49f8-89b5-fb771151402f-kube-api-access-cg5tx" (OuterVolumeSpecName: "kube-api-access-cg5tx") pod "4eaf1fcf-a995-49f8-89b5-fb771151402f" (UID: "4eaf1fcf-a995-49f8-89b5-fb771151402f"). InnerVolumeSpecName "kube-api-access-cg5tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.798747 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4eaf1fcf-a995-49f8-89b5-fb771151402f" (UID: "4eaf1fcf-a995-49f8-89b5-fb771151402f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.802586 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-inventory" (OuterVolumeSpecName: "inventory") pod "4eaf1fcf-a995-49f8-89b5-fb771151402f" (UID: "4eaf1fcf-a995-49f8-89b5-fb771151402f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.869413 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.869693 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.869752 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4eaf1fcf-a995-49f8-89b5-fb771151402f-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:01 crc kubenswrapper[4789]: I1216 08:45:01.869812 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg5tx\" (UniqueName: \"kubernetes.io/projected/4eaf1fcf-a995-49f8-89b5-fb771151402f-kube-api-access-cg5tx\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.190459 4789 generic.go:334] "Generic (PLEG): container finished" podID="8549517b-5f73-46a0-805f-2c30803def4a" containerID="961726de38b39a84f1131521ea58dcfefb99e0133cf780baa96fb75be604e06f" exitCode=0 Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.190570 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" event={"ID":"8549517b-5f73-46a0-805f-2c30803def4a","Type":"ContainerDied","Data":"961726de38b39a84f1131521ea58dcfefb99e0133cf780baa96fb75be604e06f"} Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.195670 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" 
event={"ID":"4eaf1fcf-a995-49f8-89b5-fb771151402f","Type":"ContainerDied","Data":"f9614004123acfc2a975c3daa80cd3f14a6e6716b07d3fe58ed5838b99775914"} Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.195719 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9614004123acfc2a975c3daa80cd3f14a6e6716b07d3fe58ed5838b99775914" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.195780 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-9qlnq" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.307815 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-v8jrp"] Dec 16 08:45:02 crc kubenswrapper[4789]: E1216 08:45:02.308300 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eaf1fcf-a995-49f8-89b5-fb771151402f" containerName="download-cache-openstack-openstack-cell1" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.308325 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eaf1fcf-a995-49f8-89b5-fb771151402f" containerName="download-cache-openstack-openstack-cell1" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.308514 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eaf1fcf-a995-49f8-89b5-fb771151402f" containerName="download-cache-openstack-openstack-cell1" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.309195 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.311889 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.312273 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.312620 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.312822 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.317217 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-v8jrp"] Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.379707 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r427x\" (UniqueName: \"kubernetes.io/projected/7e556aea-3590-4797-a0f7-27cfbc22be03-kube-api-access-r427x\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.379765 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-inventory\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.380056 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ssh-key\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.380271 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ceph\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.481967 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ceph\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.482157 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r427x\" (UniqueName: \"kubernetes.io/projected/7e556aea-3590-4797-a0f7-27cfbc22be03-kube-api-access-r427x\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.482194 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-inventory\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " 
pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.482299 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ssh-key\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.487585 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-inventory\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.488413 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ssh-key\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.488513 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ceph\") pod \"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.507727 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r427x\" (UniqueName: \"kubernetes.io/projected/7e556aea-3590-4797-a0f7-27cfbc22be03-kube-api-access-r427x\") pod 
\"configure-network-openstack-openstack-cell1-v8jrp\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:02 crc kubenswrapper[4789]: I1216 08:45:02.631388 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.226894 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-v8jrp"] Dec 16 08:45:03 crc kubenswrapper[4789]: W1216 08:45:03.231959 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e556aea_3590_4797_a0f7_27cfbc22be03.slice/crio-325e24fa023e0c9b3197ef76f99f1875789a81fba02f25a68ebf90192fad1faf WatchSource:0}: Error finding container 325e24fa023e0c9b3197ef76f99f1875789a81fba02f25a68ebf90192fad1faf: Status 404 returned error can't find the container with id 325e24fa023e0c9b3197ef76f99f1875789a81fba02f25a68ebf90192fad1faf Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.444725 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.516689 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chw5r\" (UniqueName: \"kubernetes.io/projected/8549517b-5f73-46a0-805f-2c30803def4a-kube-api-access-chw5r\") pod \"8549517b-5f73-46a0-805f-2c30803def4a\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.516838 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8549517b-5f73-46a0-805f-2c30803def4a-secret-volume\") pod \"8549517b-5f73-46a0-805f-2c30803def4a\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.516986 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8549517b-5f73-46a0-805f-2c30803def4a-config-volume\") pod \"8549517b-5f73-46a0-805f-2c30803def4a\" (UID: \"8549517b-5f73-46a0-805f-2c30803def4a\") " Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.517739 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8549517b-5f73-46a0-805f-2c30803def4a-config-volume" (OuterVolumeSpecName: "config-volume") pod "8549517b-5f73-46a0-805f-2c30803def4a" (UID: "8549517b-5f73-46a0-805f-2c30803def4a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.526414 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8549517b-5f73-46a0-805f-2c30803def4a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8549517b-5f73-46a0-805f-2c30803def4a" (UID: "8549517b-5f73-46a0-805f-2c30803def4a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.526706 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8549517b-5f73-46a0-805f-2c30803def4a-kube-api-access-chw5r" (OuterVolumeSpecName: "kube-api-access-chw5r") pod "8549517b-5f73-46a0-805f-2c30803def4a" (UID: "8549517b-5f73-46a0-805f-2c30803def4a"). InnerVolumeSpecName "kube-api-access-chw5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.619982 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chw5r\" (UniqueName: \"kubernetes.io/projected/8549517b-5f73-46a0-805f-2c30803def4a-kube-api-access-chw5r\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.620029 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8549517b-5f73-46a0-805f-2c30803def4a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:03 crc kubenswrapper[4789]: I1216 08:45:03.620048 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8549517b-5f73-46a0-805f-2c30803def4a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:04 crc kubenswrapper[4789]: I1216 08:45:04.219547 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" event={"ID":"8549517b-5f73-46a0-805f-2c30803def4a","Type":"ContainerDied","Data":"bd518ba4657ceda2a8a060295badcb02e8d700db966529fa059d07d97e2a9b7a"} Dec 16 08:45:04 crc kubenswrapper[4789]: I1216 08:45:04.219954 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd518ba4657ceda2a8a060295badcb02e8d700db966529fa059d07d97e2a9b7a" Dec 16 08:45:04 crc kubenswrapper[4789]: I1216 08:45:04.220039 4789 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8" Dec 16 08:45:04 crc kubenswrapper[4789]: I1216 08:45:04.226301 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" event={"ID":"7e556aea-3590-4797-a0f7-27cfbc22be03","Type":"ContainerStarted","Data":"325e24fa023e0c9b3197ef76f99f1875789a81fba02f25a68ebf90192fad1faf"} Dec 16 08:45:04 crc kubenswrapper[4789]: I1216 08:45:04.273934 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl"] Dec 16 08:45:04 crc kubenswrapper[4789]: I1216 08:45:04.281298 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431200-rk6nl"] Dec 16 08:45:05 crc kubenswrapper[4789]: I1216 08:45:05.105721 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:45:05 crc kubenswrapper[4789]: E1216 08:45:05.106253 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:45:05 crc kubenswrapper[4789]: I1216 08:45:05.238158 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" event={"ID":"7e556aea-3590-4797-a0f7-27cfbc22be03","Type":"ContainerStarted","Data":"8bde120ea27c219d66bcef18574b50b055118ec67b5800599e05efc35008ae86"} Dec 16 08:45:05 crc kubenswrapper[4789]: I1216 08:45:05.264510 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" podStartSLOduration=2.3997109 podStartE2EDuration="3.264490507s" podCreationTimestamp="2025-12-16 08:45:02 +0000 UTC" firstStartedPulling="2025-12-16 08:45:03.242322614 +0000 UTC m=+6841.504210243" lastFinishedPulling="2025-12-16 08:45:04.107102221 +0000 UTC m=+6842.368989850" observedRunningTime="2025-12-16 08:45:05.258376118 +0000 UTC m=+6843.520263747" watchObservedRunningTime="2025-12-16 08:45:05.264490507 +0000 UTC m=+6843.526378136" Dec 16 08:45:06 crc kubenswrapper[4789]: I1216 08:45:06.144039 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a191b34-d6cb-4afc-accf-4ec4ba9734af" path="/var/lib/kubelet/pods/2a191b34-d6cb-4afc-accf-4ec4ba9734af/volumes" Dec 16 08:45:18 crc kubenswrapper[4789]: I1216 08:45:18.105834 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:45:18 crc kubenswrapper[4789]: E1216 08:45:18.106773 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:45:33 crc kubenswrapper[4789]: I1216 08:45:33.106295 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:45:33 crc kubenswrapper[4789]: E1216 08:45:33.107095 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:45:34 crc kubenswrapper[4789]: I1216 08:45:34.059421 4789 scope.go:117] "RemoveContainer" containerID="a7978195adc3ea814d11f59cee1995ab921d9071e34f799870d687f0e300a335" Dec 16 08:45:34 crc kubenswrapper[4789]: I1216 08:45:34.275023 4789 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod8549517b-5f73-46a0-805f-2c30803def4a"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod8549517b-5f73-46a0-805f-2c30803def4a] : Timed out while waiting for systemd to remove kubepods-burstable-pod8549517b_5f73_46a0_805f_2c30803def4a.slice" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.151200 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5q8vr"] Dec 16 08:45:40 crc kubenswrapper[4789]: E1216 08:45:40.152103 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8549517b-5f73-46a0-805f-2c30803def4a" containerName="collect-profiles" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.152117 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8549517b-5f73-46a0-805f-2c30803def4a" containerName="collect-profiles" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.152330 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8549517b-5f73-46a0-805f-2c30803def4a" containerName="collect-profiles" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.153790 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.160993 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5q8vr"] Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.279593 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-catalog-content\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.279697 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-utilities\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.279744 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqntd\" (UniqueName: \"kubernetes.io/projected/1cfe4a1e-6406-44a8-b67b-d60a398201b9-kube-api-access-zqntd\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.381485 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-catalog-content\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.381571 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-utilities\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.381601 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqntd\" (UniqueName: \"kubernetes.io/projected/1cfe4a1e-6406-44a8-b67b-d60a398201b9-kube-api-access-zqntd\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.382107 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-catalog-content\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.382185 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-utilities\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.404905 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqntd\" (UniqueName: \"kubernetes.io/projected/1cfe4a1e-6406-44a8-b67b-d60a398201b9-kube-api-access-zqntd\") pod \"certified-operators-5q8vr\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:40 crc kubenswrapper[4789]: I1216 08:45:40.477298 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:41 crc kubenswrapper[4789]: I1216 08:45:41.001095 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5q8vr"] Dec 16 08:45:41 crc kubenswrapper[4789]: I1216 08:45:41.540242 4789 generic.go:334] "Generic (PLEG): container finished" podID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerID="f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c" exitCode=0 Dec 16 08:45:41 crc kubenswrapper[4789]: I1216 08:45:41.540452 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q8vr" event={"ID":"1cfe4a1e-6406-44a8-b67b-d60a398201b9","Type":"ContainerDied","Data":"f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c"} Dec 16 08:45:41 crc kubenswrapper[4789]: I1216 08:45:41.540631 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q8vr" event={"ID":"1cfe4a1e-6406-44a8-b67b-d60a398201b9","Type":"ContainerStarted","Data":"bb881959a5111407c7ddbf6cbb93bef268f4fc8345b50e283802dfa56613cc89"} Dec 16 08:45:41 crc kubenswrapper[4789]: I1216 08:45:41.542582 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:45:42 crc kubenswrapper[4789]: I1216 08:45:42.552735 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q8vr" event={"ID":"1cfe4a1e-6406-44a8-b67b-d60a398201b9","Type":"ContainerStarted","Data":"9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf"} Dec 16 08:45:43 crc kubenswrapper[4789]: I1216 08:45:43.564493 4789 generic.go:334] "Generic (PLEG): container finished" podID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerID="9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf" exitCode=0 Dec 16 08:45:43 crc kubenswrapper[4789]: I1216 08:45:43.564542 4789 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-5q8vr" event={"ID":"1cfe4a1e-6406-44a8-b67b-d60a398201b9","Type":"ContainerDied","Data":"9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf"} Dec 16 08:45:44 crc kubenswrapper[4789]: I1216 08:45:44.606184 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q8vr" event={"ID":"1cfe4a1e-6406-44a8-b67b-d60a398201b9","Type":"ContainerStarted","Data":"1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1"} Dec 16 08:45:44 crc kubenswrapper[4789]: I1216 08:45:44.632870 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5q8vr" podStartSLOduration=2.226677483 podStartE2EDuration="4.632851355s" podCreationTimestamp="2025-12-16 08:45:40 +0000 UTC" firstStartedPulling="2025-12-16 08:45:41.542365738 +0000 UTC m=+6879.804253367" lastFinishedPulling="2025-12-16 08:45:43.94853961 +0000 UTC m=+6882.210427239" observedRunningTime="2025-12-16 08:45:44.62816076 +0000 UTC m=+6882.890048389" watchObservedRunningTime="2025-12-16 08:45:44.632851355 +0000 UTC m=+6882.894738984" Dec 16 08:45:45 crc kubenswrapper[4789]: I1216 08:45:45.104873 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:45:45 crc kubenswrapper[4789]: E1216 08:45:45.105212 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:45:50 crc kubenswrapper[4789]: I1216 08:45:50.477661 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:50 crc kubenswrapper[4789]: I1216 08:45:50.478432 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:50 crc kubenswrapper[4789]: I1216 08:45:50.525798 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:50 crc kubenswrapper[4789]: I1216 08:45:50.710551 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:50 crc kubenswrapper[4789]: I1216 08:45:50.768308 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5q8vr"] Dec 16 08:45:52 crc kubenswrapper[4789]: I1216 08:45:52.683225 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5q8vr" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="registry-server" containerID="cri-o://1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1" gracePeriod=2 Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.216568 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.343284 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqntd\" (UniqueName: \"kubernetes.io/projected/1cfe4a1e-6406-44a8-b67b-d60a398201b9-kube-api-access-zqntd\") pod \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.343577 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-utilities\") pod \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.343700 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-catalog-content\") pod \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\" (UID: \"1cfe4a1e-6406-44a8-b67b-d60a398201b9\") " Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.344576 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-utilities" (OuterVolumeSpecName: "utilities") pod "1cfe4a1e-6406-44a8-b67b-d60a398201b9" (UID: "1cfe4a1e-6406-44a8-b67b-d60a398201b9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.344828 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.366075 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfe4a1e-6406-44a8-b67b-d60a398201b9-kube-api-access-zqntd" (OuterVolumeSpecName: "kube-api-access-zqntd") pod "1cfe4a1e-6406-44a8-b67b-d60a398201b9" (UID: "1cfe4a1e-6406-44a8-b67b-d60a398201b9"). InnerVolumeSpecName "kube-api-access-zqntd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.410063 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cfe4a1e-6406-44a8-b67b-d60a398201b9" (UID: "1cfe4a1e-6406-44a8-b67b-d60a398201b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.446992 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cfe4a1e-6406-44a8-b67b-d60a398201b9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.447027 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqntd\" (UniqueName: \"kubernetes.io/projected/1cfe4a1e-6406-44a8-b67b-d60a398201b9-kube-api-access-zqntd\") on node \"crc\" DevicePath \"\"" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.693271 4789 generic.go:334] "Generic (PLEG): container finished" podID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerID="1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1" exitCode=0 Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.693306 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q8vr" event={"ID":"1cfe4a1e-6406-44a8-b67b-d60a398201b9","Type":"ContainerDied","Data":"1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1"} Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.693294 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5q8vr" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.693334 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5q8vr" event={"ID":"1cfe4a1e-6406-44a8-b67b-d60a398201b9","Type":"ContainerDied","Data":"bb881959a5111407c7ddbf6cbb93bef268f4fc8345b50e283802dfa56613cc89"} Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.693349 4789 scope.go:117] "RemoveContainer" containerID="1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.711607 4789 scope.go:117] "RemoveContainer" containerID="9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.734437 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5q8vr"] Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.742519 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5q8vr"] Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.750504 4789 scope.go:117] "RemoveContainer" containerID="f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.781147 4789 scope.go:117] "RemoveContainer" containerID="1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1" Dec 16 08:45:53 crc kubenswrapper[4789]: E1216 08:45:53.781568 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1\": container with ID starting with 1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1 not found: ID does not exist" containerID="1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.781605 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1"} err="failed to get container status \"1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1\": rpc error: code = NotFound desc = could not find container \"1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1\": container with ID starting with 1a59ced9e5c07a41534e8c58ce92ed1b0e0c33e29bf4e8384cde889c6c81a5d1 not found: ID does not exist" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.781632 4789 scope.go:117] "RemoveContainer" containerID="9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf" Dec 16 08:45:53 crc kubenswrapper[4789]: E1216 08:45:53.782058 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf\": container with ID starting with 9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf not found: ID does not exist" containerID="9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.782089 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf"} err="failed to get container status \"9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf\": rpc error: code = NotFound desc = could not find container \"9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf\": container with ID starting with 9cb6567b1a8b0d3253ddcf901e6c8750f909859d4403390103af6bf7f986aabf not found: ID does not exist" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.782106 4789 scope.go:117] "RemoveContainer" containerID="f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c" Dec 16 08:45:53 crc kubenswrapper[4789]: E1216 
08:45:53.782404 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c\": container with ID starting with f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c not found: ID does not exist" containerID="f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c" Dec 16 08:45:53 crc kubenswrapper[4789]: I1216 08:45:53.782439 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c"} err="failed to get container status \"f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c\": rpc error: code = NotFound desc = could not find container \"f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c\": container with ID starting with f046bc7dd9eb6c1a8a5d2ea8a0a0a94742d851f7210ed0b7fbb05bcbbf23158c not found: ID does not exist" Dec 16 08:45:54 crc kubenswrapper[4789]: I1216 08:45:54.117229 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" path="/var/lib/kubelet/pods/1cfe4a1e-6406-44a8-b67b-d60a398201b9/volumes" Dec 16 08:46:00 crc kubenswrapper[4789]: I1216 08:46:00.105619 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:46:00 crc kubenswrapper[4789]: E1216 08:46:00.106690 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:46:15 crc kubenswrapper[4789]: I1216 08:46:15.104717 
4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:46:15 crc kubenswrapper[4789]: E1216 08:46:15.105579 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:46:19 crc kubenswrapper[4789]: I1216 08:46:19.922475 4789 generic.go:334] "Generic (PLEG): container finished" podID="7e556aea-3590-4797-a0f7-27cfbc22be03" containerID="8bde120ea27c219d66bcef18574b50b055118ec67b5800599e05efc35008ae86" exitCode=0 Dec 16 08:46:19 crc kubenswrapper[4789]: I1216 08:46:19.922583 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" event={"ID":"7e556aea-3590-4797-a0f7-27cfbc22be03","Type":"ContainerDied","Data":"8bde120ea27c219d66bcef18574b50b055118ec67b5800599e05efc35008ae86"} Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.374757 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.406404 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r427x\" (UniqueName: \"kubernetes.io/projected/7e556aea-3590-4797-a0f7-27cfbc22be03-kube-api-access-r427x\") pod \"7e556aea-3590-4797-a0f7-27cfbc22be03\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.406522 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ceph\") pod \"7e556aea-3590-4797-a0f7-27cfbc22be03\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.406547 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ssh-key\") pod \"7e556aea-3590-4797-a0f7-27cfbc22be03\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.406580 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-inventory\") pod \"7e556aea-3590-4797-a0f7-27cfbc22be03\" (UID: \"7e556aea-3590-4797-a0f7-27cfbc22be03\") " Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.414343 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e556aea-3590-4797-a0f7-27cfbc22be03-kube-api-access-r427x" (OuterVolumeSpecName: "kube-api-access-r427x") pod "7e556aea-3590-4797-a0f7-27cfbc22be03" (UID: "7e556aea-3590-4797-a0f7-27cfbc22be03"). InnerVolumeSpecName "kube-api-access-r427x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.418065 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ceph" (OuterVolumeSpecName: "ceph") pod "7e556aea-3590-4797-a0f7-27cfbc22be03" (UID: "7e556aea-3590-4797-a0f7-27cfbc22be03"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.490151 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-inventory" (OuterVolumeSpecName: "inventory") pod "7e556aea-3590-4797-a0f7-27cfbc22be03" (UID: "7e556aea-3590-4797-a0f7-27cfbc22be03"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.495096 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e556aea-3590-4797-a0f7-27cfbc22be03" (UID: "7e556aea-3590-4797-a0f7-27cfbc22be03"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.511159 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r427x\" (UniqueName: \"kubernetes.io/projected/7e556aea-3590-4797-a0f7-27cfbc22be03-kube-api-access-r427x\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.511192 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.511201 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.511211 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e556aea-3590-4797-a0f7-27cfbc22be03-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.943029 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" event={"ID":"7e556aea-3590-4797-a0f7-27cfbc22be03","Type":"ContainerDied","Data":"325e24fa023e0c9b3197ef76f99f1875789a81fba02f25a68ebf90192fad1faf"} Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.943069 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="325e24fa023e0c9b3197ef76f99f1875789a81fba02f25a68ebf90192fad1faf" Dec 16 08:46:21 crc kubenswrapper[4789]: I1216 08:46:21.943084 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-v8jrp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.027562 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-4zxgp"] Dec 16 08:46:22 crc kubenswrapper[4789]: E1216 08:46:22.038616 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="extract-content" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.038851 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="extract-content" Dec 16 08:46:22 crc kubenswrapper[4789]: E1216 08:46:22.039007 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e556aea-3590-4797-a0f7-27cfbc22be03" containerName="configure-network-openstack-openstack-cell1" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.039079 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e556aea-3590-4797-a0f7-27cfbc22be03" containerName="configure-network-openstack-openstack-cell1" Dec 16 08:46:22 crc kubenswrapper[4789]: E1216 08:46:22.039135 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="extract-utilities" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.039182 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="extract-utilities" Dec 16 08:46:22 crc kubenswrapper[4789]: E1216 08:46:22.039247 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="registry-server" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.039296 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="registry-server" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.040255 4789 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfe4a1e-6406-44a8-b67b-d60a398201b9" containerName="registry-server" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.040289 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e556aea-3590-4797-a0f7-27cfbc22be03" containerName="configure-network-openstack-openstack-cell1" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.041075 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-4zxgp"] Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.041154 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.053683 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.053863 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.053986 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.054277 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.126898 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ssh-key\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.127086 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-inventory\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.127131 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xtlk\" (UniqueName: \"kubernetes.io/projected/ae3c5bd6-2381-4bd4-8567-f2fecac95765-kube-api-access-5xtlk\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.127189 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ceph\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.229137 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ssh-key\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.229256 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-inventory\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " 
pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.229280 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xtlk\" (UniqueName: \"kubernetes.io/projected/ae3c5bd6-2381-4bd4-8567-f2fecac95765-kube-api-access-5xtlk\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.229321 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ceph\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.232483 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ssh-key\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.232493 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ceph\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.233472 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-inventory\") pod 
\"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.247532 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xtlk\" (UniqueName: \"kubernetes.io/projected/ae3c5bd6-2381-4bd4-8567-f2fecac95765-kube-api-access-5xtlk\") pod \"validate-network-openstack-openstack-cell1-4zxgp\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.359460 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.870393 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-4zxgp"] Dec 16 08:46:22 crc kubenswrapper[4789]: I1216 08:46:22.956335 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" event={"ID":"ae3c5bd6-2381-4bd4-8567-f2fecac95765","Type":"ContainerStarted","Data":"c42b688476afc52be912073bc477150e8413f0ca69d7a1e2072d49d6e39f1c26"} Dec 16 08:46:23 crc kubenswrapper[4789]: I1216 08:46:23.966848 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" event={"ID":"ae3c5bd6-2381-4bd4-8567-f2fecac95765","Type":"ContainerStarted","Data":"f67619e39ffdb68e9c26fe4e15b29865bdb55014f88047cf0cea477df6b2ca33"} Dec 16 08:46:23 crc kubenswrapper[4789]: I1216 08:46:23.991168 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" podStartSLOduration=1.2831298850000001 podStartE2EDuration="1.991148728s" podCreationTimestamp="2025-12-16 08:46:22 +0000 UTC" 
firstStartedPulling="2025-12-16 08:46:22.880757209 +0000 UTC m=+6921.142644838" lastFinishedPulling="2025-12-16 08:46:23.588776052 +0000 UTC m=+6921.850663681" observedRunningTime="2025-12-16 08:46:23.982145199 +0000 UTC m=+6922.244032838" watchObservedRunningTime="2025-12-16 08:46:23.991148728 +0000 UTC m=+6922.253036357" Dec 16 08:46:29 crc kubenswrapper[4789]: I1216 08:46:29.012872 4789 generic.go:334] "Generic (PLEG): container finished" podID="ae3c5bd6-2381-4bd4-8567-f2fecac95765" containerID="f67619e39ffdb68e9c26fe4e15b29865bdb55014f88047cf0cea477df6b2ca33" exitCode=0 Dec 16 08:46:29 crc kubenswrapper[4789]: I1216 08:46:29.012963 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" event={"ID":"ae3c5bd6-2381-4bd4-8567-f2fecac95765","Type":"ContainerDied","Data":"f67619e39ffdb68e9c26fe4e15b29865bdb55014f88047cf0cea477df6b2ca33"} Dec 16 08:46:29 crc kubenswrapper[4789]: I1216 08:46:29.105110 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:46:29 crc kubenswrapper[4789]: E1216 08:46:29.105679 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.445887 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.606531 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ceph\") pod \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.607086 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ssh-key\") pod \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.607179 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xtlk\" (UniqueName: \"kubernetes.io/projected/ae3c5bd6-2381-4bd4-8567-f2fecac95765-kube-api-access-5xtlk\") pod \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.607348 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-inventory\") pod \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\" (UID: \"ae3c5bd6-2381-4bd4-8567-f2fecac95765\") " Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.613088 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ceph" (OuterVolumeSpecName: "ceph") pod "ae3c5bd6-2381-4bd4-8567-f2fecac95765" (UID: "ae3c5bd6-2381-4bd4-8567-f2fecac95765"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.621217 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3c5bd6-2381-4bd4-8567-f2fecac95765-kube-api-access-5xtlk" (OuterVolumeSpecName: "kube-api-access-5xtlk") pod "ae3c5bd6-2381-4bd4-8567-f2fecac95765" (UID: "ae3c5bd6-2381-4bd4-8567-f2fecac95765"). InnerVolumeSpecName "kube-api-access-5xtlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.648151 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-inventory" (OuterVolumeSpecName: "inventory") pod "ae3c5bd6-2381-4bd4-8567-f2fecac95765" (UID: "ae3c5bd6-2381-4bd4-8567-f2fecac95765"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.656313 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae3c5bd6-2381-4bd4-8567-f2fecac95765" (UID: "ae3c5bd6-2381-4bd4-8567-f2fecac95765"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.709967 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.710016 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xtlk\" (UniqueName: \"kubernetes.io/projected/ae3c5bd6-2381-4bd4-8567-f2fecac95765-kube-api-access-5xtlk\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.710031 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:30 crc kubenswrapper[4789]: I1216 08:46:30.710040 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ae3c5bd6-2381-4bd4-8567-f2fecac95765-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.030173 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" event={"ID":"ae3c5bd6-2381-4bd4-8567-f2fecac95765","Type":"ContainerDied","Data":"c42b688476afc52be912073bc477150e8413f0ca69d7a1e2072d49d6e39f1c26"} Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.030208 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42b688476afc52be912073bc477150e8413f0ca69d7a1e2072d49d6e39f1c26" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.030213 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4zxgp" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.118471 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-zhhd9"] Dec 16 08:46:31 crc kubenswrapper[4789]: E1216 08:46:31.125726 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3c5bd6-2381-4bd4-8567-f2fecac95765" containerName="validate-network-openstack-openstack-cell1" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.125772 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3c5bd6-2381-4bd4-8567-f2fecac95765" containerName="validate-network-openstack-openstack-cell1" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.126174 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3c5bd6-2381-4bd4-8567-f2fecac95765" containerName="validate-network-openstack-openstack-cell1" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.128249 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.131411 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.131545 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.131764 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.132574 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.136856 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-zhhd9"] Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.221470 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgnz\" (UniqueName: \"kubernetes.io/projected/451b6be9-d35b-4c1a-b4ce-448dcb086baf-kube-api-access-kmgnz\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.221538 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ceph\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.221587 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-inventory\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.221677 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ssh-key\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.323283 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgnz\" (UniqueName: \"kubernetes.io/projected/451b6be9-d35b-4c1a-b4ce-448dcb086baf-kube-api-access-kmgnz\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.323333 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ceph\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.323371 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-inventory\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.323454 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ssh-key\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.326755 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-inventory\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.328788 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ceph\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.329527 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ssh-key\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.338968 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgnz\" (UniqueName: \"kubernetes.io/projected/451b6be9-d35b-4c1a-b4ce-448dcb086baf-kube-api-access-kmgnz\") pod \"install-os-openstack-openstack-cell1-zhhd9\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.448648 4789 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:46:31 crc kubenswrapper[4789]: I1216 08:46:31.968333 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-zhhd9"] Dec 16 08:46:31 crc kubenswrapper[4789]: W1216 08:46:31.969346 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451b6be9_d35b_4c1a_b4ce_448dcb086baf.slice/crio-883b7a94796d3e8bb781f8e6bc564be97c9a574fa292113e77425815f1d1de2b WatchSource:0}: Error finding container 883b7a94796d3e8bb781f8e6bc564be97c9a574fa292113e77425815f1d1de2b: Status 404 returned error can't find the container with id 883b7a94796d3e8bb781f8e6bc564be97c9a574fa292113e77425815f1d1de2b Dec 16 08:46:32 crc kubenswrapper[4789]: I1216 08:46:32.040439 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" event={"ID":"451b6be9-d35b-4c1a-b4ce-448dcb086baf","Type":"ContainerStarted","Data":"883b7a94796d3e8bb781f8e6bc564be97c9a574fa292113e77425815f1d1de2b"} Dec 16 08:46:33 crc kubenswrapper[4789]: I1216 08:46:33.052390 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" event={"ID":"451b6be9-d35b-4c1a-b4ce-448dcb086baf","Type":"ContainerStarted","Data":"fbe4e7bada142d571c5a57c75ca8f096b652794db0ef1d5b9e505f7c1d01b6fc"} Dec 16 08:46:33 crc kubenswrapper[4789]: I1216 08:46:33.070869 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" podStartSLOduration=1.4844119120000001 podStartE2EDuration="2.070850399s" podCreationTimestamp="2025-12-16 08:46:31 +0000 UTC" firstStartedPulling="2025-12-16 08:46:31.97156648 +0000 UTC m=+6930.233454109" lastFinishedPulling="2025-12-16 08:46:32.558004977 +0000 UTC m=+6930.819892596" observedRunningTime="2025-12-16 
08:46:33.067195449 +0000 UTC m=+6931.329083088" watchObservedRunningTime="2025-12-16 08:46:33.070850399 +0000 UTC m=+6931.332738028" Dec 16 08:46:41 crc kubenswrapper[4789]: I1216 08:46:41.105508 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:46:41 crc kubenswrapper[4789]: E1216 08:46:41.106308 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:46:54 crc kubenswrapper[4789]: I1216 08:46:54.105774 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:46:54 crc kubenswrapper[4789]: E1216 08:46:54.107762 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:47:06 crc kubenswrapper[4789]: I1216 08:47:06.104818 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:47:06 crc kubenswrapper[4789]: E1216 08:47:06.105653 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.087893 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l2nxv"] Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.091046 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.118410 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2nxv"] Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.159785 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-utilities\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.160702 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-catalog-content\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.160764 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlf2f\" (UniqueName: \"kubernetes.io/projected/23804744-1531-4269-9fbe-556e5f42ecb1-kube-api-access-qlf2f\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 
16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.263130 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-utilities\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.263213 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-catalog-content\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.263245 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlf2f\" (UniqueName: \"kubernetes.io/projected/23804744-1531-4269-9fbe-556e5f42ecb1-kube-api-access-qlf2f\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.263770 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-utilities\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.263791 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-catalog-content\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 
08:47:13.296480 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlf2f\" (UniqueName: \"kubernetes.io/projected/23804744-1531-4269-9fbe-556e5f42ecb1-kube-api-access-qlf2f\") pod \"redhat-marketplace-l2nxv\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.413264 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:13 crc kubenswrapper[4789]: I1216 08:47:13.966362 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2nxv"] Dec 16 08:47:14 crc kubenswrapper[4789]: I1216 08:47:14.392149 4789 generic.go:334] "Generic (PLEG): container finished" podID="23804744-1531-4269-9fbe-556e5f42ecb1" containerID="242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23" exitCode=0 Dec 16 08:47:14 crc kubenswrapper[4789]: I1216 08:47:14.392414 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2nxv" event={"ID":"23804744-1531-4269-9fbe-556e5f42ecb1","Type":"ContainerDied","Data":"242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23"} Dec 16 08:47:14 crc kubenswrapper[4789]: I1216 08:47:14.392438 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2nxv" event={"ID":"23804744-1531-4269-9fbe-556e5f42ecb1","Type":"ContainerStarted","Data":"a22e50ca71abc1e1f734448c4676e30bf9c483e45a10f4b5c9898339fed38be3"} Dec 16 08:47:16 crc kubenswrapper[4789]: I1216 08:47:16.412290 4789 generic.go:334] "Generic (PLEG): container finished" podID="23804744-1531-4269-9fbe-556e5f42ecb1" containerID="cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03" exitCode=0 Dec 16 08:47:16 crc kubenswrapper[4789]: I1216 08:47:16.412382 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-l2nxv" event={"ID":"23804744-1531-4269-9fbe-556e5f42ecb1","Type":"ContainerDied","Data":"cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03"} Dec 16 08:47:16 crc kubenswrapper[4789]: I1216 08:47:16.415701 4789 generic.go:334] "Generic (PLEG): container finished" podID="451b6be9-d35b-4c1a-b4ce-448dcb086baf" containerID="fbe4e7bada142d571c5a57c75ca8f096b652794db0ef1d5b9e505f7c1d01b6fc" exitCode=0 Dec 16 08:47:16 crc kubenswrapper[4789]: I1216 08:47:16.415750 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" event={"ID":"451b6be9-d35b-4c1a-b4ce-448dcb086baf","Type":"ContainerDied","Data":"fbe4e7bada142d571c5a57c75ca8f096b652794db0ef1d5b9e505f7c1d01b6fc"} Dec 16 08:47:17 crc kubenswrapper[4789]: I1216 08:47:17.426981 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2nxv" event={"ID":"23804744-1531-4269-9fbe-556e5f42ecb1","Type":"ContainerStarted","Data":"33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57"} Dec 16 08:47:17 crc kubenswrapper[4789]: I1216 08:47:17.444774 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l2nxv" podStartSLOduration=1.893163703 podStartE2EDuration="4.444753242s" podCreationTimestamp="2025-12-16 08:47:13 +0000 UTC" firstStartedPulling="2025-12-16 08:47:14.394246312 +0000 UTC m=+6972.656133941" lastFinishedPulling="2025-12-16 08:47:16.945835851 +0000 UTC m=+6975.207723480" observedRunningTime="2025-12-16 08:47:17.440428817 +0000 UTC m=+6975.702316446" watchObservedRunningTime="2025-12-16 08:47:17.444753242 +0000 UTC m=+6975.706640871" Dec 16 08:47:17 crc kubenswrapper[4789]: I1216 08:47:17.953946 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.060225 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ssh-key\") pod \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.060303 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmgnz\" (UniqueName: \"kubernetes.io/projected/451b6be9-d35b-4c1a-b4ce-448dcb086baf-kube-api-access-kmgnz\") pod \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.060443 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-inventory\") pod \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.060536 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ceph\") pod \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\" (UID: \"451b6be9-d35b-4c1a-b4ce-448dcb086baf\") " Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.067085 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ceph" (OuterVolumeSpecName: "ceph") pod "451b6be9-d35b-4c1a-b4ce-448dcb086baf" (UID: "451b6be9-d35b-4c1a-b4ce-448dcb086baf"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.068053 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451b6be9-d35b-4c1a-b4ce-448dcb086baf-kube-api-access-kmgnz" (OuterVolumeSpecName: "kube-api-access-kmgnz") pod "451b6be9-d35b-4c1a-b4ce-448dcb086baf" (UID: "451b6be9-d35b-4c1a-b4ce-448dcb086baf"). InnerVolumeSpecName "kube-api-access-kmgnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.096791 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "451b6be9-d35b-4c1a-b4ce-448dcb086baf" (UID: "451b6be9-d35b-4c1a-b4ce-448dcb086baf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.106185 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:47:18 crc kubenswrapper[4789]: E1216 08:47:18.106644 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.118340 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-inventory" (OuterVolumeSpecName: "inventory") pod "451b6be9-d35b-4c1a-b4ce-448dcb086baf" (UID: "451b6be9-d35b-4c1a-b4ce-448dcb086baf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.163456 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.163500 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.163513 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmgnz\" (UniqueName: \"kubernetes.io/projected/451b6be9-d35b-4c1a-b4ce-448dcb086baf-kube-api-access-kmgnz\") on node \"crc\" DevicePath \"\"" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.163521 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451b6be9-d35b-4c1a-b4ce-448dcb086baf-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.434788 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" event={"ID":"451b6be9-d35b-4c1a-b4ce-448dcb086baf","Type":"ContainerDied","Data":"883b7a94796d3e8bb781f8e6bc564be97c9a574fa292113e77425815f1d1de2b"} Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.434839 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="883b7a94796d3e8bb781f8e6bc564be97c9a574fa292113e77425815f1d1de2b" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.434891 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zhhd9" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.557993 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-k4rq4"] Dec 16 08:47:18 crc kubenswrapper[4789]: E1216 08:47:18.558452 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451b6be9-d35b-4c1a-b4ce-448dcb086baf" containerName="install-os-openstack-openstack-cell1" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.558476 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="451b6be9-d35b-4c1a-b4ce-448dcb086baf" containerName="install-os-openstack-openstack-cell1" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.558735 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="451b6be9-d35b-4c1a-b4ce-448dcb086baf" containerName="install-os-openstack-openstack-cell1" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.559650 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.561906 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.562181 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.562292 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.562420 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.573824 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-k4rq4"] Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.674419 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvw9x\" (UniqueName: \"kubernetes.io/projected/408cd4a2-4575-49af-992b-a5f2dde363ef-kube-api-access-bvw9x\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.674595 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ssh-key\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.674747 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-inventory\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.674947 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ceph\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.777131 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvw9x\" (UniqueName: \"kubernetes.io/projected/408cd4a2-4575-49af-992b-a5f2dde363ef-kube-api-access-bvw9x\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.777215 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ssh-key\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.777264 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-inventory\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.777328 
4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ceph\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.780894 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-inventory\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.791349 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ssh-key\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.792383 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ceph\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 08:47:18.794098 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvw9x\" (UniqueName: \"kubernetes.io/projected/408cd4a2-4575-49af-992b-a5f2dde363ef-kube-api-access-bvw9x\") pod \"configure-os-openstack-openstack-cell1-k4rq4\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:18 crc kubenswrapper[4789]: I1216 
08:47:18.889522 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:47:19 crc kubenswrapper[4789]: I1216 08:47:19.453644 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-k4rq4"] Dec 16 08:47:20 crc kubenswrapper[4789]: I1216 08:47:20.457294 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" event={"ID":"408cd4a2-4575-49af-992b-a5f2dde363ef","Type":"ContainerStarted","Data":"3a4629507b32f372d62434ee2a8ad2e0e1c8fdee524faad563e99b329abf2681"} Dec 16 08:47:20 crc kubenswrapper[4789]: I1216 08:47:20.457644 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" event={"ID":"408cd4a2-4575-49af-992b-a5f2dde363ef","Type":"ContainerStarted","Data":"955bf49aa526c97baafe1552e004a58c071b725aa751fa47171db29e54c28efa"} Dec 16 08:47:20 crc kubenswrapper[4789]: I1216 08:47:20.480673 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" podStartSLOduration=1.8978285879999999 podStartE2EDuration="2.480654277s" podCreationTimestamp="2025-12-16 08:47:18 +0000 UTC" firstStartedPulling="2025-12-16 08:47:19.462951649 +0000 UTC m=+6977.724839278" lastFinishedPulling="2025-12-16 08:47:20.045777338 +0000 UTC m=+6978.307664967" observedRunningTime="2025-12-16 08:47:20.472207051 +0000 UTC m=+6978.734094710" watchObservedRunningTime="2025-12-16 08:47:20.480654277 +0000 UTC m=+6978.742541906" Dec 16 08:47:23 crc kubenswrapper[4789]: I1216 08:47:23.414424 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:23 crc kubenswrapper[4789]: I1216 08:47:23.415200 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:23 crc kubenswrapper[4789]: I1216 08:47:23.458424 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:23 crc kubenswrapper[4789]: I1216 08:47:23.528209 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:23 crc kubenswrapper[4789]: I1216 08:47:23.700090 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2nxv"] Dec 16 08:47:25 crc kubenswrapper[4789]: I1216 08:47:25.506634 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l2nxv" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="registry-server" containerID="cri-o://33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57" gracePeriod=2 Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.017588 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.136605 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-catalog-content\") pod \"23804744-1531-4269-9fbe-556e5f42ecb1\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.137030 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-utilities\") pod \"23804744-1531-4269-9fbe-556e5f42ecb1\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.137305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlf2f\" (UniqueName: \"kubernetes.io/projected/23804744-1531-4269-9fbe-556e5f42ecb1-kube-api-access-qlf2f\") pod \"23804744-1531-4269-9fbe-556e5f42ecb1\" (UID: \"23804744-1531-4269-9fbe-556e5f42ecb1\") " Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.137852 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-utilities" (OuterVolumeSpecName: "utilities") pod "23804744-1531-4269-9fbe-556e5f42ecb1" (UID: "23804744-1531-4269-9fbe-556e5f42ecb1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.143270 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23804744-1531-4269-9fbe-556e5f42ecb1-kube-api-access-qlf2f" (OuterVolumeSpecName: "kube-api-access-qlf2f") pod "23804744-1531-4269-9fbe-556e5f42ecb1" (UID: "23804744-1531-4269-9fbe-556e5f42ecb1"). InnerVolumeSpecName "kube-api-access-qlf2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.158491 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23804744-1531-4269-9fbe-556e5f42ecb1" (UID: "23804744-1531-4269-9fbe-556e5f42ecb1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.240389 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.240426 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23804744-1531-4269-9fbe-556e5f42ecb1-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.240437 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlf2f\" (UniqueName: \"kubernetes.io/projected/23804744-1531-4269-9fbe-556e5f42ecb1-kube-api-access-qlf2f\") on node \"crc\" DevicePath \"\"" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.516949 4789 generic.go:334] "Generic (PLEG): container finished" podID="23804744-1531-4269-9fbe-556e5f42ecb1" containerID="33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57" exitCode=0 Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.516989 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2nxv" event={"ID":"23804744-1531-4269-9fbe-556e5f42ecb1","Type":"ContainerDied","Data":"33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57"} Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.517013 4789 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2nxv" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.517020 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2nxv" event={"ID":"23804744-1531-4269-9fbe-556e5f42ecb1","Type":"ContainerDied","Data":"a22e50ca71abc1e1f734448c4676e30bf9c483e45a10f4b5c9898339fed38be3"} Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.517036 4789 scope.go:117] "RemoveContainer" containerID="33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.543561 4789 scope.go:117] "RemoveContainer" containerID="cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.550623 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2nxv"] Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.560619 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2nxv"] Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.565836 4789 scope.go:117] "RemoveContainer" containerID="242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.619789 4789 scope.go:117] "RemoveContainer" containerID="33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57" Dec 16 08:47:26 crc kubenswrapper[4789]: E1216 08:47:26.620259 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57\": container with ID starting with 33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57 not found: ID does not exist" containerID="33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.620295 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57"} err="failed to get container status \"33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57\": rpc error: code = NotFound desc = could not find container \"33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57\": container with ID starting with 33bc639d472649be84c6d213d08d502c6474c12322351d6a8d7fe4fb0fccbe57 not found: ID does not exist" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.620322 4789 scope.go:117] "RemoveContainer" containerID="cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03" Dec 16 08:47:26 crc kubenswrapper[4789]: E1216 08:47:26.620642 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03\": container with ID starting with cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03 not found: ID does not exist" containerID="cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.620672 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03"} err="failed to get container status \"cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03\": rpc error: code = NotFound desc = could not find container \"cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03\": container with ID starting with cb28c147ffd2b0b35ff39d7d3d5ce118d20cca6356490e2792166a31ed795a03 not found: ID does not exist" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.620692 4789 scope.go:117] "RemoveContainer" containerID="242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23" Dec 16 08:47:26 crc kubenswrapper[4789]: E1216 
08:47:26.621077 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23\": container with ID starting with 242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23 not found: ID does not exist" containerID="242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23" Dec 16 08:47:26 crc kubenswrapper[4789]: I1216 08:47:26.621097 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23"} err="failed to get container status \"242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23\": rpc error: code = NotFound desc = could not find container \"242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23\": container with ID starting with 242f703154241f0dda1d66136d80e8e35b0e43aec6133f2cb3d4e0c141c8ed23 not found: ID does not exist" Dec 16 08:47:28 crc kubenswrapper[4789]: I1216 08:47:28.129744 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" path="/var/lib/kubelet/pods/23804744-1531-4269-9fbe-556e5f42ecb1/volumes" Dec 16 08:47:29 crc kubenswrapper[4789]: I1216 08:47:29.105400 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:47:29 crc kubenswrapper[4789]: E1216 08:47:29.106025 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:47:41 crc kubenswrapper[4789]: I1216 08:47:41.104881 
4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:47:41 crc kubenswrapper[4789]: E1216 08:47:41.105601 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:47:56 crc kubenswrapper[4789]: I1216 08:47:56.105375 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:47:56 crc kubenswrapper[4789]: E1216 08:47:56.106294 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:48:02 crc kubenswrapper[4789]: I1216 08:48:02.845483 4789 generic.go:334] "Generic (PLEG): container finished" podID="408cd4a2-4575-49af-992b-a5f2dde363ef" containerID="3a4629507b32f372d62434ee2a8ad2e0e1c8fdee524faad563e99b329abf2681" exitCode=0 Dec 16 08:48:02 crc kubenswrapper[4789]: I1216 08:48:02.845691 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" event={"ID":"408cd4a2-4575-49af-992b-a5f2dde363ef","Type":"ContainerDied","Data":"3a4629507b32f372d62434ee2a8ad2e0e1c8fdee524faad563e99b329abf2681"} Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.262845 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.404546 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvw9x\" (UniqueName: \"kubernetes.io/projected/408cd4a2-4575-49af-992b-a5f2dde363ef-kube-api-access-bvw9x\") pod \"408cd4a2-4575-49af-992b-a5f2dde363ef\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.404881 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ssh-key\") pod \"408cd4a2-4575-49af-992b-a5f2dde363ef\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.405540 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ceph\") pod \"408cd4a2-4575-49af-992b-a5f2dde363ef\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.405639 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-inventory\") pod \"408cd4a2-4575-49af-992b-a5f2dde363ef\" (UID: \"408cd4a2-4575-49af-992b-a5f2dde363ef\") " Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.409578 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408cd4a2-4575-49af-992b-a5f2dde363ef-kube-api-access-bvw9x" (OuterVolumeSpecName: "kube-api-access-bvw9x") pod "408cd4a2-4575-49af-992b-a5f2dde363ef" (UID: "408cd4a2-4575-49af-992b-a5f2dde363ef"). InnerVolumeSpecName "kube-api-access-bvw9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.409902 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ceph" (OuterVolumeSpecName: "ceph") pod "408cd4a2-4575-49af-992b-a5f2dde363ef" (UID: "408cd4a2-4575-49af-992b-a5f2dde363ef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.431377 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-inventory" (OuterVolumeSpecName: "inventory") pod "408cd4a2-4575-49af-992b-a5f2dde363ef" (UID: "408cd4a2-4575-49af-992b-a5f2dde363ef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.433152 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "408cd4a2-4575-49af-992b-a5f2dde363ef" (UID: "408cd4a2-4575-49af-992b-a5f2dde363ef"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.508652 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.508684 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.508696 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvw9x\" (UniqueName: \"kubernetes.io/projected/408cd4a2-4575-49af-992b-a5f2dde363ef-kube-api-access-bvw9x\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.508705 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/408cd4a2-4575-49af-992b-a5f2dde363ef-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.866450 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" event={"ID":"408cd4a2-4575-49af-992b-a5f2dde363ef","Type":"ContainerDied","Data":"955bf49aa526c97baafe1552e004a58c071b725aa751fa47171db29e54c28efa"} Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.866509 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955bf49aa526c97baafe1552e004a58c071b725aa751fa47171db29e54c28efa" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.866559 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-k4rq4" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.968540 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-527hf"] Dec 16 08:48:04 crc kubenswrapper[4789]: E1216 08:48:04.968982 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="extract-utilities" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.969000 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="extract-utilities" Dec 16 08:48:04 crc kubenswrapper[4789]: E1216 08:48:04.969033 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="extract-content" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.969040 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="extract-content" Dec 16 08:48:04 crc kubenswrapper[4789]: E1216 08:48:04.969053 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408cd4a2-4575-49af-992b-a5f2dde363ef" containerName="configure-os-openstack-openstack-cell1" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.969059 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="408cd4a2-4575-49af-992b-a5f2dde363ef" containerName="configure-os-openstack-openstack-cell1" Dec 16 08:48:04 crc kubenswrapper[4789]: E1216 08:48:04.969072 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="registry-server" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.969078 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="registry-server" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.969271 4789 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="23804744-1531-4269-9fbe-556e5f42ecb1" containerName="registry-server" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.969291 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="408cd4a2-4575-49af-992b-a5f2dde363ef" containerName="configure-os-openstack-openstack-cell1" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.970707 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.975731 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.978541 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.978761 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.980004 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:48:04 crc kubenswrapper[4789]: I1216 08:48:04.990101 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-527hf"] Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.124907 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8cb\" (UniqueName: \"kubernetes.io/projected/5405dbb7-1841-4e80-a4a4-08513cb61917-kube-api-access-jx8cb\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.125113 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ceph\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.125167 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-inventory-0\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.125255 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.226576 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.226930 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8cb\" (UniqueName: \"kubernetes.io/projected/5405dbb7-1841-4e80-a4a4-08513cb61917-kube-api-access-jx8cb\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.227129 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ceph\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.227275 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-inventory-0\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.233695 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.235283 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-inventory-0\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.238470 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ceph\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.245168 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8cb\" (UniqueName: 
\"kubernetes.io/projected/5405dbb7-1841-4e80-a4a4-08513cb61917-kube-api-access-jx8cb\") pod \"ssh-known-hosts-openstack-527hf\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.290299 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.789118 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-527hf"] Dec 16 08:48:05 crc kubenswrapper[4789]: I1216 08:48:05.875243 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-527hf" event={"ID":"5405dbb7-1841-4e80-a4a4-08513cb61917","Type":"ContainerStarted","Data":"aa6a7bae876c81f31cc506fe83c9d17196fcad8fb74bfb7b10f4f3ffbe693219"} Dec 16 08:48:08 crc kubenswrapper[4789]: I1216 08:48:08.105677 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:48:08 crc kubenswrapper[4789]: E1216 08:48:08.106551 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:48:08 crc kubenswrapper[4789]: I1216 08:48:08.903400 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-527hf" event={"ID":"5405dbb7-1841-4e80-a4a4-08513cb61917","Type":"ContainerStarted","Data":"e391f70e0c05e2f3bfc5e9da87b1c6765a1ba1ee9e50b0677f5ee0a03ba0ee00"} Dec 16 08:48:08 crc kubenswrapper[4789]: I1216 08:48:08.919235 4789 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-527hf" podStartSLOduration=2.798570119 podStartE2EDuration="4.919217754s" podCreationTimestamp="2025-12-16 08:48:04 +0000 UTC" firstStartedPulling="2025-12-16 08:48:05.795027916 +0000 UTC m=+7024.056915545" lastFinishedPulling="2025-12-16 08:48:07.915675551 +0000 UTC m=+7026.177563180" observedRunningTime="2025-12-16 08:48:08.917410691 +0000 UTC m=+7027.179298330" watchObservedRunningTime="2025-12-16 08:48:08.919217754 +0000 UTC m=+7027.181105383" Dec 16 08:48:17 crc kubenswrapper[4789]: I1216 08:48:17.980029 4789 generic.go:334] "Generic (PLEG): container finished" podID="5405dbb7-1841-4e80-a4a4-08513cb61917" containerID="e391f70e0c05e2f3bfc5e9da87b1c6765a1ba1ee9e50b0677f5ee0a03ba0ee00" exitCode=0 Dec 16 08:48:17 crc kubenswrapper[4789]: I1216 08:48:17.980109 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-527hf" event={"ID":"5405dbb7-1841-4e80-a4a4-08513cb61917","Type":"ContainerDied","Data":"e391f70e0c05e2f3bfc5e9da87b1c6765a1ba1ee9e50b0677f5ee0a03ba0ee00"} Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.390264 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.530073 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-inventory-0\") pod \"5405dbb7-1841-4e80-a4a4-08513cb61917\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.530135 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx8cb\" (UniqueName: \"kubernetes.io/projected/5405dbb7-1841-4e80-a4a4-08513cb61917-kube-api-access-jx8cb\") pod \"5405dbb7-1841-4e80-a4a4-08513cb61917\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.530252 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ssh-key-openstack-cell1\") pod \"5405dbb7-1841-4e80-a4a4-08513cb61917\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.530336 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ceph\") pod \"5405dbb7-1841-4e80-a4a4-08513cb61917\" (UID: \"5405dbb7-1841-4e80-a4a4-08513cb61917\") " Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.535374 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ceph" (OuterVolumeSpecName: "ceph") pod "5405dbb7-1841-4e80-a4a4-08513cb61917" (UID: "5405dbb7-1841-4e80-a4a4-08513cb61917"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.535637 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5405dbb7-1841-4e80-a4a4-08513cb61917-kube-api-access-jx8cb" (OuterVolumeSpecName: "kube-api-access-jx8cb") pod "5405dbb7-1841-4e80-a4a4-08513cb61917" (UID: "5405dbb7-1841-4e80-a4a4-08513cb61917"). InnerVolumeSpecName "kube-api-access-jx8cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.556467 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5405dbb7-1841-4e80-a4a4-08513cb61917" (UID: "5405dbb7-1841-4e80-a4a4-08513cb61917"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.561564 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5405dbb7-1841-4e80-a4a4-08513cb61917" (UID: "5405dbb7-1841-4e80-a4a4-08513cb61917"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.632797 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.632831 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.632843 4789 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5405dbb7-1841-4e80-a4a4-08513cb61917-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:19 crc kubenswrapper[4789]: I1216 08:48:19.632853 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx8cb\" (UniqueName: \"kubernetes.io/projected/5405dbb7-1841-4e80-a4a4-08513cb61917-kube-api-access-jx8cb\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.034421 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-527hf" event={"ID":"5405dbb7-1841-4e80-a4a4-08513cb61917","Type":"ContainerDied","Data":"aa6a7bae876c81f31cc506fe83c9d17196fcad8fb74bfb7b10f4f3ffbe693219"} Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.034746 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa6a7bae876c81f31cc506fe83c9d17196fcad8fb74bfb7b10f4f3ffbe693219" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.034684 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-527hf" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.085163 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-hsc2c"] Dec 16 08:48:20 crc kubenswrapper[4789]: E1216 08:48:20.085643 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5405dbb7-1841-4e80-a4a4-08513cb61917" containerName="ssh-known-hosts-openstack" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.085663 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5405dbb7-1841-4e80-a4a4-08513cb61917" containerName="ssh-known-hosts-openstack" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.085873 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5405dbb7-1841-4e80-a4a4-08513cb61917" containerName="ssh-known-hosts-openstack" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.086620 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.097590 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.097773 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.097876 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.098014 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.120548 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-hsc2c"] Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 
08:48:20.245388 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ceph\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.245722 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-inventory\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.246037 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ssh-key\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.247115 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr75k\" (UniqueName: \"kubernetes.io/projected/d5ff0c7a-b121-4c2d-a17e-acb58761e419-kube-api-access-qr75k\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.349714 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ssh-key\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " 
pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.349818 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr75k\" (UniqueName: \"kubernetes.io/projected/d5ff0c7a-b121-4c2d-a17e-acb58761e419-kube-api-access-qr75k\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.349933 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ceph\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.349999 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-inventory\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.354482 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ssh-key\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.366645 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ceph\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " 
pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.366645 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-inventory\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.370319 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr75k\" (UniqueName: \"kubernetes.io/projected/d5ff0c7a-b121-4c2d-a17e-acb58761e419-kube-api-access-qr75k\") pod \"run-os-openstack-openstack-cell1-hsc2c\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.422304 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:20 crc kubenswrapper[4789]: I1216 08:48:20.924355 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-hsc2c"] Dec 16 08:48:20 crc kubenswrapper[4789]: W1216 08:48:20.925857 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ff0c7a_b121_4c2d_a17e_acb58761e419.slice/crio-d17e240665c1f1e00148cf1f8e78276497e9e7877d974ef137d57988cca8406a WatchSource:0}: Error finding container d17e240665c1f1e00148cf1f8e78276497e9e7877d974ef137d57988cca8406a: Status 404 returned error can't find the container with id d17e240665c1f1e00148cf1f8e78276497e9e7877d974ef137d57988cca8406a Dec 16 08:48:21 crc kubenswrapper[4789]: I1216 08:48:21.043142 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" 
event={"ID":"d5ff0c7a-b121-4c2d-a17e-acb58761e419","Type":"ContainerStarted","Data":"d17e240665c1f1e00148cf1f8e78276497e9e7877d974ef137d57988cca8406a"} Dec 16 08:48:21 crc kubenswrapper[4789]: I1216 08:48:21.106258 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:48:21 crc kubenswrapper[4789]: E1216 08:48:21.106573 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:48:22 crc kubenswrapper[4789]: I1216 08:48:22.062181 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" event={"ID":"d5ff0c7a-b121-4c2d-a17e-acb58761e419","Type":"ContainerStarted","Data":"0ed86304e46719cc812ffaa9daaa35cf5a3badecaf81ef5106ec03a04e735398"} Dec 16 08:48:22 crc kubenswrapper[4789]: I1216 08:48:22.078696 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" podStartSLOduration=1.330404162 podStartE2EDuration="2.078674287s" podCreationTimestamp="2025-12-16 08:48:20 +0000 UTC" firstStartedPulling="2025-12-16 08:48:20.928160578 +0000 UTC m=+7039.190048207" lastFinishedPulling="2025-12-16 08:48:21.676430703 +0000 UTC m=+7039.938318332" observedRunningTime="2025-12-16 08:48:22.07471841 +0000 UTC m=+7040.336606039" watchObservedRunningTime="2025-12-16 08:48:22.078674287 +0000 UTC m=+7040.340561916" Dec 16 08:48:29 crc kubenswrapper[4789]: I1216 08:48:29.118823 4789 generic.go:334] "Generic (PLEG): container finished" podID="d5ff0c7a-b121-4c2d-a17e-acb58761e419" 
containerID="0ed86304e46719cc812ffaa9daaa35cf5a3badecaf81ef5106ec03a04e735398" exitCode=0 Dec 16 08:48:29 crc kubenswrapper[4789]: I1216 08:48:29.118943 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" event={"ID":"d5ff0c7a-b121-4c2d-a17e-acb58761e419","Type":"ContainerDied","Data":"0ed86304e46719cc812ffaa9daaa35cf5a3badecaf81ef5106ec03a04e735398"} Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.582998 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.756313 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ssh-key\") pod \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.757969 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr75k\" (UniqueName: \"kubernetes.io/projected/d5ff0c7a-b121-4c2d-a17e-acb58761e419-kube-api-access-qr75k\") pod \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.758111 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-inventory\") pod \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.758639 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ceph\") pod \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\" (UID: \"d5ff0c7a-b121-4c2d-a17e-acb58761e419\") " Dec 16 
08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.763203 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ceph" (OuterVolumeSpecName: "ceph") pod "d5ff0c7a-b121-4c2d-a17e-acb58761e419" (UID: "d5ff0c7a-b121-4c2d-a17e-acb58761e419"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.763469 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ff0c7a-b121-4c2d-a17e-acb58761e419-kube-api-access-qr75k" (OuterVolumeSpecName: "kube-api-access-qr75k") pod "d5ff0c7a-b121-4c2d-a17e-acb58761e419" (UID: "d5ff0c7a-b121-4c2d-a17e-acb58761e419"). InnerVolumeSpecName "kube-api-access-qr75k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.803356 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-inventory" (OuterVolumeSpecName: "inventory") pod "d5ff0c7a-b121-4c2d-a17e-acb58761e419" (UID: "d5ff0c7a-b121-4c2d-a17e-acb58761e419"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.806650 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5ff0c7a-b121-4c2d-a17e-acb58761e419" (UID: "d5ff0c7a-b121-4c2d-a17e-acb58761e419"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.862358 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.862413 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr75k\" (UniqueName: \"kubernetes.io/projected/d5ff0c7a-b121-4c2d-a17e-acb58761e419-kube-api-access-qr75k\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.862428 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:30 crc kubenswrapper[4789]: I1216 08:48:30.862438 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5ff0c7a-b121-4c2d-a17e-acb58761e419-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.139138 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" event={"ID":"d5ff0c7a-b121-4c2d-a17e-acb58761e419","Type":"ContainerDied","Data":"d17e240665c1f1e00148cf1f8e78276497e9e7877d974ef137d57988cca8406a"} Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.139180 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d17e240665c1f1e00148cf1f8e78276497e9e7877d974ef137d57988cca8406a" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.139230 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-hsc2c" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.207457 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-6m8gq"] Dec 16 08:48:31 crc kubenswrapper[4789]: E1216 08:48:31.208077 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ff0c7a-b121-4c2d-a17e-acb58761e419" containerName="run-os-openstack-openstack-cell1" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.208097 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ff0c7a-b121-4c2d-a17e-acb58761e419" containerName="run-os-openstack-openstack-cell1" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.208344 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ff0c7a-b121-4c2d-a17e-acb58761e419" containerName="run-os-openstack-openstack-cell1" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.209323 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.213415 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.213739 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.215962 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.217244 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.218180 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-6m8gq"] Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.373597 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.374083 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89tvr\" (UniqueName: \"kubernetes.io/projected/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-kube-api-access-89tvr\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.374190 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ceph\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.374285 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-inventory\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.476616 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-inventory\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.476697 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.476779 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89tvr\" (UniqueName: \"kubernetes.io/projected/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-kube-api-access-89tvr\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.476831 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ceph\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.490867 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ceph\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.490867 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-inventory\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.491474 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.495369 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89tvr\" (UniqueName: \"kubernetes.io/projected/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-kube-api-access-89tvr\") pod \"reboot-os-openstack-openstack-cell1-6m8gq\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:31 crc kubenswrapper[4789]: I1216 08:48:31.528385 4789 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:32 crc kubenswrapper[4789]: I1216 08:48:32.064286 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-6m8gq"] Dec 16 08:48:32 crc kubenswrapper[4789]: I1216 08:48:32.113133 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:48:32 crc kubenswrapper[4789]: I1216 08:48:32.162135 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" event={"ID":"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c","Type":"ContainerStarted","Data":"7ce5f3635d15cf9dc9c02b8470f0b1a578ce6872a8d5c19524983e47b1b73684"} Dec 16 08:48:33 crc kubenswrapper[4789]: I1216 08:48:33.174689 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"77fffb00e7e434d3555e3b3538fc62383b6e955440c7beef42ce88d64343310d"} Dec 16 08:48:33 crc kubenswrapper[4789]: I1216 08:48:33.190815 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" event={"ID":"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c","Type":"ContainerStarted","Data":"615440234c9b3994a9bedece9299185198a523d985c02aa6e4acc4f802b004fa"} Dec 16 08:48:33 crc kubenswrapper[4789]: I1216 08:48:33.231448 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" podStartSLOduration=1.589273374 podStartE2EDuration="2.231423381s" podCreationTimestamp="2025-12-16 08:48:31 +0000 UTC" firstStartedPulling="2025-12-16 08:48:32.060630778 +0000 UTC m=+7050.322518407" lastFinishedPulling="2025-12-16 08:48:32.702780775 +0000 UTC m=+7050.964668414" observedRunningTime="2025-12-16 08:48:33.219524791 +0000 UTC m=+7051.481412430" 
watchObservedRunningTime="2025-12-16 08:48:33.231423381 +0000 UTC m=+7051.493311010" Dec 16 08:48:49 crc kubenswrapper[4789]: I1216 08:48:49.317585 4789 generic.go:334] "Generic (PLEG): container finished" podID="ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" containerID="615440234c9b3994a9bedece9299185198a523d985c02aa6e4acc4f802b004fa" exitCode=0 Dec 16 08:48:49 crc kubenswrapper[4789]: I1216 08:48:49.317684 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" event={"ID":"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c","Type":"ContainerDied","Data":"615440234c9b3994a9bedece9299185198a523d985c02aa6e4acc4f802b004fa"} Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.740722 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.899724 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-inventory\") pod \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.899814 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ssh-key\") pod \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.899859 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89tvr\" (UniqueName: \"kubernetes.io/projected/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-kube-api-access-89tvr\") pod \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.899885 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ceph\") pod \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\" (UID: \"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c\") " Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.905349 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ceph" (OuterVolumeSpecName: "ceph") pod "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" (UID: "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.905426 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-kube-api-access-89tvr" (OuterVolumeSpecName: "kube-api-access-89tvr") pod "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" (UID: "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c"). InnerVolumeSpecName "kube-api-access-89tvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.927216 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" (UID: "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:50 crc kubenswrapper[4789]: I1216 08:48:50.930023 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-inventory" (OuterVolumeSpecName: "inventory") pod "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" (UID: "ea1d73ed-d948-4c5a-bda3-c4f13fc0572c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.003563 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.003607 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.003622 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89tvr\" (UniqueName: \"kubernetes.io/projected/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-kube-api-access-89tvr\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.003638 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea1d73ed-d948-4c5a-bda3-c4f13fc0572c-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.341561 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" event={"ID":"ea1d73ed-d948-4c5a-bda3-c4f13fc0572c","Type":"ContainerDied","Data":"7ce5f3635d15cf9dc9c02b8470f0b1a578ce6872a8d5c19524983e47b1b73684"} Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.342070 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ce5f3635d15cf9dc9c02b8470f0b1a578ce6872a8d5c19524983e47b1b73684" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.341619 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-6m8gq" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.469352 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-2mkth"] Dec 16 08:48:51 crc kubenswrapper[4789]: E1216 08:48:51.469867 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" containerName="reboot-os-openstack-openstack-cell1" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.469887 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" containerName="reboot-os-openstack-openstack-cell1" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.470124 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1d73ed-d948-4c5a-bda3-c4f13fc0572c" containerName="reboot-os-openstack-openstack-cell1" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.470990 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.474549 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.474618 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.475576 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.475556 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.486849 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-2mkth"] Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.624406 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.624597 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.624739 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.624798 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ceph\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.624838 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ssh-key\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.624866 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.624903 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-libvirt-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.625004 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.625038 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.625095 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.625157 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-inventory\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 
08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.625190 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfpvp\" (UniqueName: \"kubernetes.io/projected/839706dd-4b2b-4821-9d5b-374e1f23f6bf-kube-api-access-tfpvp\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727023 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727106 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727132 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727171 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727199 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-inventory\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727215 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfpvp\" (UniqueName: \"kubernetes.io/projected/839706dd-4b2b-4821-9d5b-374e1f23f6bf-kube-api-access-tfpvp\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727260 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727290 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " 
pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727330 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727353 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ceph\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727378 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ssh-key\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.727401 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.732587 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ssh-key\") 
pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.732836 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.732997 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-inventory\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.733412 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ceph\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.733851 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.736591 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.737872 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.738800 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.741179 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.747438 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " 
pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.747636 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.749075 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfpvp\" (UniqueName: \"kubernetes.io/projected/839706dd-4b2b-4821-9d5b-374e1f23f6bf-kube-api-access-tfpvp\") pod \"install-certs-openstack-openstack-cell1-2mkth\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:51 crc kubenswrapper[4789]: I1216 08:48:51.793124 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:48:52 crc kubenswrapper[4789]: I1216 08:48:52.331471 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-2mkth"] Dec 16 08:48:52 crc kubenswrapper[4789]: I1216 08:48:52.349869 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" event={"ID":"839706dd-4b2b-4821-9d5b-374e1f23f6bf","Type":"ContainerStarted","Data":"324a72c54040a4cce4e45782865e9578c950704ecc0aa146bc29ab53fa9e80ff"} Dec 16 08:48:55 crc kubenswrapper[4789]: I1216 08:48:55.381212 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" event={"ID":"839706dd-4b2b-4821-9d5b-374e1f23f6bf","Type":"ContainerStarted","Data":"e08a4f041529038407bd30412658e904db805cfedbb22e3a8cd508181e7c7a1a"} Dec 16 08:48:55 crc kubenswrapper[4789]: I1216 08:48:55.410521 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" podStartSLOduration=2.444509865 podStartE2EDuration="4.410502758s" podCreationTimestamp="2025-12-16 08:48:51 +0000 UTC" firstStartedPulling="2025-12-16 08:48:52.33306784 +0000 UTC m=+7070.594955469" lastFinishedPulling="2025-12-16 08:48:54.299060733 +0000 UTC m=+7072.560948362" observedRunningTime="2025-12-16 08:48:55.400038132 +0000 UTC m=+7073.661925771" watchObservedRunningTime="2025-12-16 08:48:55.410502758 +0000 UTC m=+7073.672390387" Dec 16 08:49:12 crc kubenswrapper[4789]: I1216 08:49:12.525493 4789 generic.go:334] "Generic (PLEG): container finished" podID="839706dd-4b2b-4821-9d5b-374e1f23f6bf" containerID="e08a4f041529038407bd30412658e904db805cfedbb22e3a8cd508181e7c7a1a" exitCode=0 Dec 16 08:49:12 crc kubenswrapper[4789]: I1216 08:49:12.525566 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-2mkth" event={"ID":"839706dd-4b2b-4821-9d5b-374e1f23f6bf","Type":"ContainerDied","Data":"e08a4f041529038407bd30412658e904db805cfedbb22e3a8cd508181e7c7a1a"} Dec 16 08:49:13 crc kubenswrapper[4789]: I1216 08:49:13.957723 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.140095 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ovn-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.140389 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-libvirt-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141024 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-telemetry-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141134 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ceph\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141219 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-metadata-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141302 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-dhcp-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141409 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfpvp\" (UniqueName: \"kubernetes.io/projected/839706dd-4b2b-4821-9d5b-374e1f23f6bf-kube-api-access-tfpvp\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141514 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-bootstrap-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141705 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-sriov-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.141858 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ssh-key\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.142014 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-inventory\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.142181 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-nova-combined-ca-bundle\") pod \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\" (UID: \"839706dd-4b2b-4821-9d5b-374e1f23f6bf\") " Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.146468 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.146942 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.147051 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.148154 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839706dd-4b2b-4821-9d5b-374e1f23f6bf-kube-api-access-tfpvp" (OuterVolumeSpecName: "kube-api-access-tfpvp") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "kube-api-access-tfpvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.157089 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.157174 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.157206 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.157992 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ceph" (OuterVolumeSpecName: "ceph") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.158271 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.160109 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.177618 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.184302 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-inventory" (OuterVolumeSpecName: "inventory") pod "839706dd-4b2b-4821-9d5b-374e1f23f6bf" (UID: "839706dd-4b2b-4821-9d5b-374e1f23f6bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244842 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244884 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244897 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244910 4789 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc 
kubenswrapper[4789]: I1216 08:49:14.244941 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244953 4789 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244965 4789 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244979 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.244990 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.245000 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.245103 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfpvp\" (UniqueName: \"kubernetes.io/projected/839706dd-4b2b-4821-9d5b-374e1f23f6bf-kube-api-access-tfpvp\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc 
kubenswrapper[4789]: I1216 08:49:14.245200 4789 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839706dd-4b2b-4821-9d5b-374e1f23f6bf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.549417 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" event={"ID":"839706dd-4b2b-4821-9d5b-374e1f23f6bf","Type":"ContainerDied","Data":"324a72c54040a4cce4e45782865e9578c950704ecc0aa146bc29ab53fa9e80ff"} Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.549456 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="324a72c54040a4cce4e45782865e9578c950704ecc0aa146bc29ab53fa9e80ff" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.549507 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-2mkth" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.647975 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vltnx"] Dec 16 08:49:14 crc kubenswrapper[4789]: E1216 08:49:14.648453 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839706dd-4b2b-4821-9d5b-374e1f23f6bf" containerName="install-certs-openstack-openstack-cell1" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.648470 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="839706dd-4b2b-4821-9d5b-374e1f23f6bf" containerName="install-certs-openstack-openstack-cell1" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.648675 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="839706dd-4b2b-4821-9d5b-374e1f23f6bf" containerName="install-certs-openstack-openstack-cell1" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.649452 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.652111 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.652471 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.652650 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.652838 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.663062 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vltnx"] Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.778135 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqlfl\" (UniqueName: \"kubernetes.io/projected/19b99655-7a2f-4367-9b3f-c0897a02bed3-kube-api-access-mqlfl\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.778521 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-inventory\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.778697 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.778726 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ceph\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.880721 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.880762 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ceph\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.880842 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqlfl\" (UniqueName: \"kubernetes.io/projected/19b99655-7a2f-4367-9b3f-c0897a02bed3-kube-api-access-mqlfl\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.880895 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-inventory\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.886069 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-inventory\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.886605 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.895304 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ceph\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.898444 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqlfl\" (UniqueName: \"kubernetes.io/projected/19b99655-7a2f-4367-9b3f-c0897a02bed3-kube-api-access-mqlfl\") pod \"ceph-client-openstack-openstack-cell1-vltnx\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:14 crc kubenswrapper[4789]: I1216 08:49:14.983084 4789 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:15 crc kubenswrapper[4789]: I1216 08:49:15.570693 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vltnx"] Dec 16 08:49:16 crc kubenswrapper[4789]: I1216 08:49:16.566640 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" event={"ID":"19b99655-7a2f-4367-9b3f-c0897a02bed3","Type":"ContainerStarted","Data":"d740aa3a23be9db2a610e5f7d0956417e4734f1ccb43cb0b46ae60bd252164c9"} Dec 16 08:49:16 crc kubenswrapper[4789]: I1216 08:49:16.568775 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" event={"ID":"19b99655-7a2f-4367-9b3f-c0897a02bed3","Type":"ContainerStarted","Data":"b640d9aa5cd5de91c129dac64080212fa5a9191605d551b3adc7f5fb6982a7a4"} Dec 16 08:49:16 crc kubenswrapper[4789]: I1216 08:49:16.593035 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" podStartSLOduration=2.164873912 podStartE2EDuration="2.593002507s" podCreationTimestamp="2025-12-16 08:49:14 +0000 UTC" firstStartedPulling="2025-12-16 08:49:15.574750755 +0000 UTC m=+7093.836638384" lastFinishedPulling="2025-12-16 08:49:16.00287936 +0000 UTC m=+7094.264766979" observedRunningTime="2025-12-16 08:49:16.586183791 +0000 UTC m=+7094.848071420" watchObservedRunningTime="2025-12-16 08:49:16.593002507 +0000 UTC m=+7094.854890136" Dec 16 08:49:21 crc kubenswrapper[4789]: I1216 08:49:21.609966 4789 generic.go:334] "Generic (PLEG): container finished" podID="19b99655-7a2f-4367-9b3f-c0897a02bed3" containerID="d740aa3a23be9db2a610e5f7d0956417e4734f1ccb43cb0b46ae60bd252164c9" exitCode=0 Dec 16 08:49:21 crc kubenswrapper[4789]: I1216 08:49:21.610055 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" event={"ID":"19b99655-7a2f-4367-9b3f-c0897a02bed3","Type":"ContainerDied","Data":"d740aa3a23be9db2a610e5f7d0956417e4734f1ccb43cb0b46ae60bd252164c9"} Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.100841 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.246216 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqlfl\" (UniqueName: \"kubernetes.io/projected/19b99655-7a2f-4367-9b3f-c0897a02bed3-kube-api-access-mqlfl\") pod \"19b99655-7a2f-4367-9b3f-c0897a02bed3\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.246285 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-inventory\") pod \"19b99655-7a2f-4367-9b3f-c0897a02bed3\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.246362 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ceph\") pod \"19b99655-7a2f-4367-9b3f-c0897a02bed3\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.246380 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ssh-key\") pod \"19b99655-7a2f-4367-9b3f-c0897a02bed3\" (UID: \"19b99655-7a2f-4367-9b3f-c0897a02bed3\") " Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.251432 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ceph" 
(OuterVolumeSpecName: "ceph") pod "19b99655-7a2f-4367-9b3f-c0897a02bed3" (UID: "19b99655-7a2f-4367-9b3f-c0897a02bed3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.258439 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b99655-7a2f-4367-9b3f-c0897a02bed3-kube-api-access-mqlfl" (OuterVolumeSpecName: "kube-api-access-mqlfl") pod "19b99655-7a2f-4367-9b3f-c0897a02bed3" (UID: "19b99655-7a2f-4367-9b3f-c0897a02bed3"). InnerVolumeSpecName "kube-api-access-mqlfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.277235 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-inventory" (OuterVolumeSpecName: "inventory") pod "19b99655-7a2f-4367-9b3f-c0897a02bed3" (UID: "19b99655-7a2f-4367-9b3f-c0897a02bed3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.277568 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "19b99655-7a2f-4367-9b3f-c0897a02bed3" (UID: "19b99655-7a2f-4367-9b3f-c0897a02bed3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.350005 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqlfl\" (UniqueName: \"kubernetes.io/projected/19b99655-7a2f-4367-9b3f-c0897a02bed3-kube-api-access-mqlfl\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.350045 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.350113 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.350126 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/19b99655-7a2f-4367-9b3f-c0897a02bed3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.631002 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" event={"ID":"19b99655-7a2f-4367-9b3f-c0897a02bed3","Type":"ContainerDied","Data":"b640d9aa5cd5de91c129dac64080212fa5a9191605d551b3adc7f5fb6982a7a4"} Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.631059 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b640d9aa5cd5de91c129dac64080212fa5a9191605d551b3adc7f5fb6982a7a4" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.631062 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vltnx" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.706597 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-ppzs5"] Dec 16 08:49:23 crc kubenswrapper[4789]: E1216 08:49:23.707119 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b99655-7a2f-4367-9b3f-c0897a02bed3" containerName="ceph-client-openstack-openstack-cell1" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.707137 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b99655-7a2f-4367-9b3f-c0897a02bed3" containerName="ceph-client-openstack-openstack-cell1" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.707374 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b99655-7a2f-4367-9b3f-c0897a02bed3" containerName="ceph-client-openstack-openstack-cell1" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.708146 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.710313 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.710702 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.711041 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.711592 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.716682 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.718957 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-ppzs5"] Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.860503 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ceph\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.860891 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 
08:49:23.860945 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.861023 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-inventory\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.861272 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftn2f\" (UniqueName: \"kubernetes.io/projected/e08f18e5-cd25-40b5-a8fa-2af2530846f4-kube-api-access-ftn2f\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.861383 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.962764 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftn2f\" (UniqueName: \"kubernetes.io/projected/e08f18e5-cd25-40b5-a8fa-2af2530846f4-kube-api-access-ftn2f\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: 
\"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.962830 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.962895 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ceph\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.962969 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.962999 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.963042 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-inventory\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: 
\"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.963982 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.967034 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-inventory\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.967131 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.967170 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.967802 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ceph\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " 
pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:23 crc kubenswrapper[4789]: I1216 08:49:23.979992 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftn2f\" (UniqueName: \"kubernetes.io/projected/e08f18e5-cd25-40b5-a8fa-2af2530846f4-kube-api-access-ftn2f\") pod \"ovn-openstack-openstack-cell1-ppzs5\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:24 crc kubenswrapper[4789]: I1216 08:49:24.027791 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:49:24 crc kubenswrapper[4789]: I1216 08:49:24.526361 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-ppzs5"] Dec 16 08:49:24 crc kubenswrapper[4789]: I1216 08:49:24.640748 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" event={"ID":"e08f18e5-cd25-40b5-a8fa-2af2530846f4","Type":"ContainerStarted","Data":"b6af15d7134b25542814a0e5ab4a8ea918364f2ec85f351a66e41770e13b6627"} Dec 16 08:49:26 crc kubenswrapper[4789]: I1216 08:49:26.660709 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" event={"ID":"e08f18e5-cd25-40b5-a8fa-2af2530846f4","Type":"ContainerStarted","Data":"a49cc763977aabff30fdd9c6891ab73f09d8759bc72bd622b7e27044cfcd3d58"} Dec 16 08:49:26 crc kubenswrapper[4789]: I1216 08:49:26.679808 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" podStartSLOduration=2.393926798 podStartE2EDuration="3.679793697s" podCreationTimestamp="2025-12-16 08:49:23 +0000 UTC" firstStartedPulling="2025-12-16 08:49:24.531110537 +0000 UTC m=+7102.792998166" lastFinishedPulling="2025-12-16 08:49:25.816977436 +0000 UTC m=+7104.078865065" observedRunningTime="2025-12-16 08:49:26.676784373 +0000 UTC 
m=+7104.938672002" watchObservedRunningTime="2025-12-16 08:49:26.679793697 +0000 UTC m=+7104.941681326" Dec 16 08:50:30 crc kubenswrapper[4789]: I1216 08:50:30.250776 4789 generic.go:334] "Generic (PLEG): container finished" podID="e08f18e5-cd25-40b5-a8fa-2af2530846f4" containerID="a49cc763977aabff30fdd9c6891ab73f09d8759bc72bd622b7e27044cfcd3d58" exitCode=0 Dec 16 08:50:30 crc kubenswrapper[4789]: I1216 08:50:30.250993 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" event={"ID":"e08f18e5-cd25-40b5-a8fa-2af2530846f4","Type":"ContainerDied","Data":"a49cc763977aabff30fdd9c6891ab73f09d8759bc72bd622b7e27044cfcd3d58"} Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.820497 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.986889 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-inventory\") pod \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.987075 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovn-combined-ca-bundle\") pod \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.987157 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftn2f\" (UniqueName: \"kubernetes.io/projected/e08f18e5-cd25-40b5-a8fa-2af2530846f4-kube-api-access-ftn2f\") pod \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 
08:50:31.987200 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ceph\") pod \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.987222 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key\") pod \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.987302 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovncontroller-config-0\") pod \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.994732 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ceph" (OuterVolumeSpecName: "ceph") pod "e08f18e5-cd25-40b5-a8fa-2af2530846f4" (UID: "e08f18e5-cd25-40b5-a8fa-2af2530846f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.994889 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08f18e5-cd25-40b5-a8fa-2af2530846f4-kube-api-access-ftn2f" (OuterVolumeSpecName: "kube-api-access-ftn2f") pod "e08f18e5-cd25-40b5-a8fa-2af2530846f4" (UID: "e08f18e5-cd25-40b5-a8fa-2af2530846f4"). InnerVolumeSpecName "kube-api-access-ftn2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:50:31 crc kubenswrapper[4789]: I1216 08:50:31.996184 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e08f18e5-cd25-40b5-a8fa-2af2530846f4" (UID: "e08f18e5-cd25-40b5-a8fa-2af2530846f4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.018939 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-inventory" (OuterVolumeSpecName: "inventory") pod "e08f18e5-cd25-40b5-a8fa-2af2530846f4" (UID: "e08f18e5-cd25-40b5-a8fa-2af2530846f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:50:32 crc kubenswrapper[4789]: E1216 08:50:32.035143 4789 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key podName:e08f18e5-cd25-40b5-a8fa-2af2530846f4 nodeName:}" failed. No retries permitted until 2025-12-16 08:50:32.535097714 +0000 UTC m=+7170.796985343 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key") pod "e08f18e5-cd25-40b5-a8fa-2af2530846f4" (UID: "e08f18e5-cd25-40b5-a8fa-2af2530846f4") : error deleting /var/lib/kubelet/pods/e08f18e5-cd25-40b5-a8fa-2af2530846f4/volume-subpaths: remove /var/lib/kubelet/pods/e08f18e5-cd25-40b5-a8fa-2af2530846f4/volume-subpaths: no such file or directory Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.035609 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e08f18e5-cd25-40b5-a8fa-2af2530846f4" (UID: "e08f18e5-cd25-40b5-a8fa-2af2530846f4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.089970 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftn2f\" (UniqueName: \"kubernetes.io/projected/e08f18e5-cd25-40b5-a8fa-2af2530846f4-kube-api-access-ftn2f\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.090005 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.090021 4789 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.090031 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:32 crc 
kubenswrapper[4789]: I1216 08:50:32.090042 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.267404 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" event={"ID":"e08f18e5-cd25-40b5-a8fa-2af2530846f4","Type":"ContainerDied","Data":"b6af15d7134b25542814a0e5ab4a8ea918364f2ec85f351a66e41770e13b6627"} Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.267441 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6af15d7134b25542814a0e5ab4a8ea918364f2ec85f351a66e41770e13b6627" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.267444 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-ppzs5" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.372218 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-n2xkz"] Dec 16 08:50:32 crc kubenswrapper[4789]: E1216 08:50:32.372767 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08f18e5-cd25-40b5-a8fa-2af2530846f4" containerName="ovn-openstack-openstack-cell1" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.372791 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08f18e5-cd25-40b5-a8fa-2af2530846f4" containerName="ovn-openstack-openstack-cell1" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.373048 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08f18e5-cd25-40b5-a8fa-2af2530846f4" containerName="ovn-openstack-openstack-cell1" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.377527 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.382178 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.382334 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.398486 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-n2xkz"] Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.497901 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.498333 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.498441 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.498547 4789 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.498644 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktwfr\" (UniqueName: \"kubernetes.io/projected/74959a3a-150a-4441-a8d3-b717d73415ca-kube-api-access-ktwfr\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.498707 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.498779 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.600930 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key\") pod \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\" (UID: \"e08f18e5-cd25-40b5-a8fa-2af2530846f4\") " Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.601352 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.601401 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.601461 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktwfr\" (UniqueName: \"kubernetes.io/projected/74959a3a-150a-4441-a8d3-b717d73415ca-kube-api-access-ktwfr\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.601507 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" 
Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.601559 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.601631 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.601700 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.604836 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e08f18e5-cd25-40b5-a8fa-2af2530846f4" (UID: "e08f18e5-cd25-40b5-a8fa-2af2530846f4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.605884 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.606148 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.606409 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.606677 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.607732 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.609630 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.619324 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktwfr\" (UniqueName: \"kubernetes.io/projected/74959a3a-150a-4441-a8d3-b717d73415ca-kube-api-access-ktwfr\") pod \"neutron-metadata-openstack-openstack-cell1-n2xkz\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.703533 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e08f18e5-cd25-40b5-a8fa-2af2530846f4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:50:32 crc kubenswrapper[4789]: I1216 08:50:32.707561 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:50:33 crc kubenswrapper[4789]: I1216 08:50:33.262997 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-n2xkz"] Dec 16 08:50:33 crc kubenswrapper[4789]: I1216 08:50:33.296411 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" event={"ID":"74959a3a-150a-4441-a8d3-b717d73415ca","Type":"ContainerStarted","Data":"d1bf4765216086b4a0582daeb0e20ed90924eebe850e02000942a6959cf5c290"} Dec 16 08:50:34 crc kubenswrapper[4789]: I1216 08:50:34.310079 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" event={"ID":"74959a3a-150a-4441-a8d3-b717d73415ca","Type":"ContainerStarted","Data":"5493a63dc9b58f0adce81b472c08a0b3ac3e3c35a07a542475e8c6e52d8a40dc"} Dec 16 08:50:34 crc kubenswrapper[4789]: I1216 08:50:34.334882 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" podStartSLOduration=1.6424690069999999 podStartE2EDuration="2.33486161s" podCreationTimestamp="2025-12-16 08:50:32 +0000 UTC" firstStartedPulling="2025-12-16 08:50:33.26709603 +0000 UTC m=+7171.528983659" lastFinishedPulling="2025-12-16 08:50:33.959488633 +0000 UTC m=+7172.221376262" observedRunningTime="2025-12-16 08:50:34.330633227 +0000 UTC m=+7172.592520856" watchObservedRunningTime="2025-12-16 08:50:34.33486161 +0000 UTC m=+7172.596749239" Dec 16 08:50:51 crc kubenswrapper[4789]: I1216 08:50:51.927560 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:50:51 crc kubenswrapper[4789]: I1216 
08:50:51.928190 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:51:21 crc kubenswrapper[4789]: I1216 08:51:21.927650 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:51:21 crc kubenswrapper[4789]: I1216 08:51:21.928243 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:51:28 crc kubenswrapper[4789]: I1216 08:51:28.786762 4789 generic.go:334] "Generic (PLEG): container finished" podID="74959a3a-150a-4441-a8d3-b717d73415ca" containerID="5493a63dc9b58f0adce81b472c08a0b3ac3e3c35a07a542475e8c6e52d8a40dc" exitCode=0 Dec 16 08:51:28 crc kubenswrapper[4789]: I1216 08:51:28.787481 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" event={"ID":"74959a3a-150a-4441-a8d3-b717d73415ca","Type":"ContainerDied","Data":"5493a63dc9b58f0adce81b472c08a0b3ac3e3c35a07a542475e8c6e52d8a40dc"} Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.253499 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.343608 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-metadata-combined-ca-bundle\") pod \"74959a3a-150a-4441-a8d3-b717d73415ca\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.344113 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktwfr\" (UniqueName: \"kubernetes.io/projected/74959a3a-150a-4441-a8d3-b717d73415ca-kube-api-access-ktwfr\") pod \"74959a3a-150a-4441-a8d3-b717d73415ca\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.344266 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ceph\") pod \"74959a3a-150a-4441-a8d3-b717d73415ca\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.344396 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ssh-key\") pod \"74959a3a-150a-4441-a8d3-b717d73415ca\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.344609 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"74959a3a-150a-4441-a8d3-b717d73415ca\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.344729 4789 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-inventory\") pod \"74959a3a-150a-4441-a8d3-b717d73415ca\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.344902 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-nova-metadata-neutron-config-0\") pod \"74959a3a-150a-4441-a8d3-b717d73415ca\" (UID: \"74959a3a-150a-4441-a8d3-b717d73415ca\") " Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.349631 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "74959a3a-150a-4441-a8d3-b717d73415ca" (UID: "74959a3a-150a-4441-a8d3-b717d73415ca"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.352040 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ceph" (OuterVolumeSpecName: "ceph") pod "74959a3a-150a-4441-a8d3-b717d73415ca" (UID: "74959a3a-150a-4441-a8d3-b717d73415ca"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.353466 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74959a3a-150a-4441-a8d3-b717d73415ca-kube-api-access-ktwfr" (OuterVolumeSpecName: "kube-api-access-ktwfr") pod "74959a3a-150a-4441-a8d3-b717d73415ca" (UID: "74959a3a-150a-4441-a8d3-b717d73415ca"). InnerVolumeSpecName "kube-api-access-ktwfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.376590 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-inventory" (OuterVolumeSpecName: "inventory") pod "74959a3a-150a-4441-a8d3-b717d73415ca" (UID: "74959a3a-150a-4441-a8d3-b717d73415ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.384315 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "74959a3a-150a-4441-a8d3-b717d73415ca" (UID: "74959a3a-150a-4441-a8d3-b717d73415ca"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.385623 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "74959a3a-150a-4441-a8d3-b717d73415ca" (UID: "74959a3a-150a-4441-a8d3-b717d73415ca"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.397767 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74959a3a-150a-4441-a8d3-b717d73415ca" (UID: "74959a3a-150a-4441-a8d3-b717d73415ca"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.447781 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.447817 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.447828 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.447840 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.447849 4789 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.447858 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74959a3a-150a-4441-a8d3-b717d73415ca-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.447868 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktwfr\" (UniqueName: \"kubernetes.io/projected/74959a3a-150a-4441-a8d3-b717d73415ca-kube-api-access-ktwfr\") on node \"crc\" 
DevicePath \"\"" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.807345 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" event={"ID":"74959a3a-150a-4441-a8d3-b717d73415ca","Type":"ContainerDied","Data":"d1bf4765216086b4a0582daeb0e20ed90924eebe850e02000942a6959cf5c290"} Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.807389 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1bf4765216086b4a0582daeb0e20ed90924eebe850e02000942a6959cf5c290" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.807450 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-n2xkz" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.919461 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4lrbh"] Dec 16 08:51:30 crc kubenswrapper[4789]: E1216 08:51:30.919939 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74959a3a-150a-4441-a8d3-b717d73415ca" containerName="neutron-metadata-openstack-openstack-cell1" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.919961 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="74959a3a-150a-4441-a8d3-b717d73415ca" containerName="neutron-metadata-openstack-openstack-cell1" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.920194 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="74959a3a-150a-4441-a8d3-b717d73415ca" containerName="neutron-metadata-openstack-openstack-cell1" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.920908 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.923088 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.924147 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.924213 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.927567 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.932327 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:51:30 crc kubenswrapper[4789]: I1216 08:51:30.933879 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4lrbh"] Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.060709 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-inventory\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.060778 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ssh-key\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 
08:51:31.060885 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.060985 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ceph\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.061160 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.061225 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktx8\" (UniqueName: \"kubernetes.io/projected/4cb7847a-6a82-44b8-a1da-6583cb76efc8-kube-api-access-9ktx8\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.164550 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-inventory\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: 
\"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.164653 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ssh-key\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.164700 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.164738 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ceph\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.164844 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.164903 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ktx8\" (UniqueName: \"kubernetes.io/projected/4cb7847a-6a82-44b8-a1da-6583cb76efc8-kube-api-access-9ktx8\") 
pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.169359 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ceph\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.169596 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.169743 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ssh-key\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.170673 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-inventory\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.170991 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-secret-0\") pod 
\"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.184674 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ktx8\" (UniqueName: \"kubernetes.io/projected/4cb7847a-6a82-44b8-a1da-6583cb76efc8-kube-api-access-9ktx8\") pod \"libvirt-openstack-openstack-cell1-4lrbh\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.242084 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.800199 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4lrbh"] Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.805093 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:51:31 crc kubenswrapper[4789]: I1216 08:51:31.817566 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" event={"ID":"4cb7847a-6a82-44b8-a1da-6583cb76efc8","Type":"ContainerStarted","Data":"4a070f5635795f600321d99a0d7d6c2978acdb6079b120c7db6f8e1cc05716b1"} Dec 16 08:51:32 crc kubenswrapper[4789]: I1216 08:51:32.841855 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" event={"ID":"4cb7847a-6a82-44b8-a1da-6583cb76efc8","Type":"ContainerStarted","Data":"79d11b75d1faf21b4e086dac3151f4979572c80631fa2352e5fb16054a9eb061"} Dec 16 08:51:32 crc kubenswrapper[4789]: I1216 08:51:32.877204 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" podStartSLOduration=2.184177083 
podStartE2EDuration="2.87718234s" podCreationTimestamp="2025-12-16 08:51:30 +0000 UTC" firstStartedPulling="2025-12-16 08:51:31.804771117 +0000 UTC m=+7230.066658746" lastFinishedPulling="2025-12-16 08:51:32.497776374 +0000 UTC m=+7230.759664003" observedRunningTime="2025-12-16 08:51:32.869671346 +0000 UTC m=+7231.131558985" watchObservedRunningTime="2025-12-16 08:51:32.87718234 +0000 UTC m=+7231.139069969" Dec 16 08:51:51 crc kubenswrapper[4789]: I1216 08:51:51.928062 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:51:51 crc kubenswrapper[4789]: I1216 08:51:51.928613 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:51:51 crc kubenswrapper[4789]: I1216 08:51:51.928674 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:51:51 crc kubenswrapper[4789]: I1216 08:51:51.930070 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77fffb00e7e434d3555e3b3538fc62383b6e955440c7beef42ce88d64343310d"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:51:51 crc kubenswrapper[4789]: I1216 08:51:51.930169 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://77fffb00e7e434d3555e3b3538fc62383b6e955440c7beef42ce88d64343310d" gracePeriod=600 Dec 16 08:51:53 crc kubenswrapper[4789]: I1216 08:51:53.041754 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="77fffb00e7e434d3555e3b3538fc62383b6e955440c7beef42ce88d64343310d" exitCode=0 Dec 16 08:51:53 crc kubenswrapper[4789]: I1216 08:51:53.041836 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"77fffb00e7e434d3555e3b3538fc62383b6e955440c7beef42ce88d64343310d"} Dec 16 08:51:53 crc kubenswrapper[4789]: I1216 08:51:53.042522 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"} Dec 16 08:51:53 crc kubenswrapper[4789]: I1216 08:51:53.042550 4789 scope.go:117] "RemoveContainer" containerID="c7625c7e694e311389e99b8eb77122c2d14245781eed6dcb243c61cbb7a880c7" Dec 16 08:52:56 crc kubenswrapper[4789]: I1216 08:52:56.993022 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2dqwl"] Dec 16 08:52:56 crc kubenswrapper[4789]: I1216 08:52:56.996747 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.009007 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dqwl"] Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.110789 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-utilities\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.111150 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-catalog-content\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.111233 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58vct\" (UniqueName: \"kubernetes.io/projected/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-kube-api-access-58vct\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.213188 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-utilities\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.213334 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-catalog-content\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.213398 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58vct\" (UniqueName: \"kubernetes.io/projected/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-kube-api-access-58vct\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.214128 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-catalog-content\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.214698 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-utilities\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.240712 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58vct\" (UniqueName: \"kubernetes.io/projected/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-kube-api-access-58vct\") pod \"community-operators-2dqwl\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.327656 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:52:57 crc kubenswrapper[4789]: I1216 08:52:57.892546 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dqwl"] Dec 16 08:52:58 crc kubenswrapper[4789]: I1216 08:52:58.673540 4789 generic.go:334] "Generic (PLEG): container finished" podID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerID="5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32" exitCode=0 Dec 16 08:52:58 crc kubenswrapper[4789]: I1216 08:52:58.673893 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dqwl" event={"ID":"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5","Type":"ContainerDied","Data":"5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32"} Dec 16 08:52:58 crc kubenswrapper[4789]: I1216 08:52:58.674056 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dqwl" event={"ID":"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5","Type":"ContainerStarted","Data":"d6b202f8194ef329e1db4f953b7fd9794146a98f125c357d9015991eaceeca73"} Dec 16 08:52:59 crc kubenswrapper[4789]: I1216 08:52:59.689244 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dqwl" event={"ID":"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5","Type":"ContainerStarted","Data":"7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6"} Dec 16 08:53:00 crc kubenswrapper[4789]: I1216 08:53:00.700166 4789 generic.go:334] "Generic (PLEG): container finished" podID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerID="7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6" exitCode=0 Dec 16 08:53:00 crc kubenswrapper[4789]: I1216 08:53:00.700264 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dqwl" 
event={"ID":"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5","Type":"ContainerDied","Data":"7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6"} Dec 16 08:53:01 crc kubenswrapper[4789]: I1216 08:53:01.712258 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dqwl" event={"ID":"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5","Type":"ContainerStarted","Data":"1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3"} Dec 16 08:53:01 crc kubenswrapper[4789]: I1216 08:53:01.736856 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2dqwl" podStartSLOduration=3.149207917 podStartE2EDuration="5.736835971s" podCreationTimestamp="2025-12-16 08:52:56 +0000 UTC" firstStartedPulling="2025-12-16 08:52:58.676239026 +0000 UTC m=+7316.938126655" lastFinishedPulling="2025-12-16 08:53:01.26386708 +0000 UTC m=+7319.525754709" observedRunningTime="2025-12-16 08:53:01.728667712 +0000 UTC m=+7319.990555341" watchObservedRunningTime="2025-12-16 08:53:01.736835971 +0000 UTC m=+7319.998723600" Dec 16 08:53:07 crc kubenswrapper[4789]: I1216 08:53:07.328889 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:53:07 crc kubenswrapper[4789]: I1216 08:53:07.330059 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:53:07 crc kubenswrapper[4789]: I1216 08:53:07.372869 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:53:07 crc kubenswrapper[4789]: I1216 08:53:07.809683 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:53:07 crc kubenswrapper[4789]: I1216 08:53:07.859493 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-2dqwl"] Dec 16 08:53:09 crc kubenswrapper[4789]: I1216 08:53:09.786159 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2dqwl" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerName="registry-server" containerID="cri-o://1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3" gracePeriod=2 Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.257129 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.333742 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-catalog-content\") pod \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.333841 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-utilities\") pod \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.333973 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58vct\" (UniqueName: \"kubernetes.io/projected/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-kube-api-access-58vct\") pod \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\" (UID: \"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5\") " Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.334976 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-utilities" (OuterVolumeSpecName: "utilities") pod "ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" (UID: 
"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.339807 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-kube-api-access-58vct" (OuterVolumeSpecName: "kube-api-access-58vct") pod "ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" (UID: "ace93b61-ef85-444c-aa4c-bb6ab7bf45e5"). InnerVolumeSpecName "kube-api-access-58vct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.387178 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" (UID: "ace93b61-ef85-444c-aa4c-bb6ab7bf45e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.436781 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58vct\" (UniqueName: \"kubernetes.io/projected/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-kube-api-access-58vct\") on node \"crc\" DevicePath \"\"" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.436813 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.436823 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.805782 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerID="1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3" exitCode=0 Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.805875 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dqwl" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.805873 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dqwl" event={"ID":"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5","Type":"ContainerDied","Data":"1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3"} Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.806225 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dqwl" event={"ID":"ace93b61-ef85-444c-aa4c-bb6ab7bf45e5","Type":"ContainerDied","Data":"d6b202f8194ef329e1db4f953b7fd9794146a98f125c357d9015991eaceeca73"} Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.806244 4789 scope.go:117] "RemoveContainer" containerID="1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.829784 4789 scope.go:117] "RemoveContainer" containerID="7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.840891 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2dqwl"] Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.849826 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2dqwl"] Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.867252 4789 scope.go:117] "RemoveContainer" containerID="5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.909103 4789 scope.go:117] "RemoveContainer" 
containerID="1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3" Dec 16 08:53:10 crc kubenswrapper[4789]: E1216 08:53:10.909622 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3\": container with ID starting with 1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3 not found: ID does not exist" containerID="1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.909718 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3"} err="failed to get container status \"1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3\": rpc error: code = NotFound desc = could not find container \"1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3\": container with ID starting with 1f5db2091088364b3a7b1c3c62c390a796ed9d8c97c6028af545d9aae40aaab3 not found: ID does not exist" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.909805 4789 scope.go:117] "RemoveContainer" containerID="7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6" Dec 16 08:53:10 crc kubenswrapper[4789]: E1216 08:53:10.910357 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6\": container with ID starting with 7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6 not found: ID does not exist" containerID="7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.910402 4789 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6"} err="failed to get container status \"7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6\": rpc error: code = NotFound desc = could not find container \"7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6\": container with ID starting with 7406cdad92490c690e690c96358a7affd5daeb50d2c520eda6d51fa0ee825ac6 not found: ID does not exist" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.910430 4789 scope.go:117] "RemoveContainer" containerID="5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32" Dec 16 08:53:10 crc kubenswrapper[4789]: E1216 08:53:10.910794 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32\": container with ID starting with 5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32 not found: ID does not exist" containerID="5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32" Dec 16 08:53:10 crc kubenswrapper[4789]: I1216 08:53:10.910816 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32"} err="failed to get container status \"5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32\": rpc error: code = NotFound desc = could not find container \"5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32\": container with ID starting with 5bc4acf7078168ce61d0d4ffa9efd274a5ad74b55dae64bac46d2e82e4dcac32 not found: ID does not exist" Dec 16 08:53:12 crc kubenswrapper[4789]: I1216 08:53:12.118429 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" path="/var/lib/kubelet/pods/ace93b61-ef85-444c-aa4c-bb6ab7bf45e5/volumes" Dec 16 08:54:21 crc kubenswrapper[4789]: I1216 
08:54:21.927885 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:54:21 crc kubenswrapper[4789]: I1216 08:54:21.928465 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:54:51 crc kubenswrapper[4789]: I1216 08:54:51.927979 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:54:51 crc kubenswrapper[4789]: I1216 08:54:51.928404 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.378494 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-klktc"] Dec 16 08:54:53 crc kubenswrapper[4789]: E1216 08:54:53.380023 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerName="registry-server" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.380145 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" 
containerName="registry-server" Dec 16 08:54:53 crc kubenswrapper[4789]: E1216 08:54:53.380252 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerName="extract-content" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.380324 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerName="extract-content" Dec 16 08:54:53 crc kubenswrapper[4789]: E1216 08:54:53.380418 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerName="extract-utilities" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.380497 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerName="extract-utilities" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.380830 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace93b61-ef85-444c-aa4c-bb6ab7bf45e5" containerName="registry-server" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.382603 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.392781 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-klktc"] Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.582951 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z568q\" (UniqueName: \"kubernetes.io/projected/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-kube-api-access-z568q\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.583045 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-catalog-content\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.583340 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-utilities\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.699576 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z568q\" (UniqueName: \"kubernetes.io/projected/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-kube-api-access-z568q\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.699701 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-catalog-content\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.699811 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-utilities\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.700777 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-utilities\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.701054 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-catalog-content\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:53 crc kubenswrapper[4789]: I1216 08:54:53.740339 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z568q\" (UniqueName: \"kubernetes.io/projected/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-kube-api-access-z568q\") pod \"redhat-operators-klktc\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:54 crc kubenswrapper[4789]: I1216 08:54:54.031180 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:54:54 crc kubenswrapper[4789]: I1216 08:54:54.589887 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-klktc"] Dec 16 08:54:54 crc kubenswrapper[4789]: I1216 08:54:54.778272 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klktc" event={"ID":"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7","Type":"ContainerStarted","Data":"49236ab55847c43e99dccc68b91d707c391653cc73a2caaa92eadcacb3ff5265"} Dec 16 08:54:55 crc kubenswrapper[4789]: I1216 08:54:55.787611 4789 generic.go:334] "Generic (PLEG): container finished" podID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerID="3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459" exitCode=0 Dec 16 08:54:55 crc kubenswrapper[4789]: I1216 08:54:55.787862 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klktc" event={"ID":"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7","Type":"ContainerDied","Data":"3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459"} Dec 16 08:54:56 crc kubenswrapper[4789]: I1216 08:54:56.800251 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klktc" event={"ID":"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7","Type":"ContainerStarted","Data":"cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06"} Dec 16 08:55:00 crc kubenswrapper[4789]: I1216 08:55:00.835929 4789 generic.go:334] "Generic (PLEG): container finished" podID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerID="cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06" exitCode=0 Dec 16 08:55:00 crc kubenswrapper[4789]: I1216 08:55:00.836099 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klktc" 
event={"ID":"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7","Type":"ContainerDied","Data":"cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06"} Dec 16 08:55:01 crc kubenswrapper[4789]: I1216 08:55:01.847875 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klktc" event={"ID":"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7","Type":"ContainerStarted","Data":"38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b"} Dec 16 08:55:01 crc kubenswrapper[4789]: I1216 08:55:01.869063 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-klktc" podStartSLOduration=3.324558705 podStartE2EDuration="8.869040952s" podCreationTimestamp="2025-12-16 08:54:53 +0000 UTC" firstStartedPulling="2025-12-16 08:54:55.789533629 +0000 UTC m=+7434.051421258" lastFinishedPulling="2025-12-16 08:55:01.334015876 +0000 UTC m=+7439.595903505" observedRunningTime="2025-12-16 08:55:01.867563185 +0000 UTC m=+7440.129450824" watchObservedRunningTime="2025-12-16 08:55:01.869040952 +0000 UTC m=+7440.130928591" Dec 16 08:55:04 crc kubenswrapper[4789]: I1216 08:55:04.031841 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:55:04 crc kubenswrapper[4789]: I1216 08:55:04.033301 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:55:05 crc kubenswrapper[4789]: I1216 08:55:05.077861 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-klktc" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="registry-server" probeResult="failure" output=< Dec 16 08:55:05 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 08:55:05 crc kubenswrapper[4789]: > Dec 16 08:55:14 crc kubenswrapper[4789]: I1216 08:55:14.075888 4789 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:55:14 crc kubenswrapper[4789]: I1216 08:55:14.130829 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:55:14 crc kubenswrapper[4789]: I1216 08:55:14.315693 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-klktc"] Dec 16 08:55:15 crc kubenswrapper[4789]: I1216 08:55:15.969857 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-klktc" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="registry-server" containerID="cri-o://38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b" gracePeriod=2 Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.402219 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.467075 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z568q\" (UniqueName: \"kubernetes.io/projected/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-kube-api-access-z568q\") pod \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.467224 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-utilities\") pod \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.467272 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-catalog-content\") pod 
\"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\" (UID: \"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7\") " Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.478300 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-kube-api-access-z568q" (OuterVolumeSpecName: "kube-api-access-z568q") pod "33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" (UID: "33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7"). InnerVolumeSpecName "kube-api-access-z568q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.478858 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-utilities" (OuterVolumeSpecName: "utilities") pod "33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" (UID: "33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.568006 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" (UID: "33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.569437 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.569471 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z568q\" (UniqueName: \"kubernetes.io/projected/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-kube-api-access-z568q\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.569482 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.994002 4789 generic.go:334] "Generic (PLEG): container finished" podID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerID="38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b" exitCode=0 Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.994065 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klktc" event={"ID":"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7","Type":"ContainerDied","Data":"38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b"} Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.994148 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-klktc" event={"ID":"33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7","Type":"ContainerDied","Data":"49236ab55847c43e99dccc68b91d707c391653cc73a2caaa92eadcacb3ff5265"} Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.994168 4789 scope.go:117] "RemoveContainer" containerID="38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b" Dec 16 08:55:16 crc kubenswrapper[4789]: I1216 08:55:16.994089 
4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-klktc" Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.044170 4789 scope.go:117] "RemoveContainer" containerID="cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06" Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.051783 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-klktc"] Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.061612 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-klktc"] Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.075999 4789 scope.go:117] "RemoveContainer" containerID="3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459" Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.121298 4789 scope.go:117] "RemoveContainer" containerID="38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b" Dec 16 08:55:17 crc kubenswrapper[4789]: E1216 08:55:17.121716 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b\": container with ID starting with 38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b not found: ID does not exist" containerID="38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b" Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.121745 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b"} err="failed to get container status \"38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b\": rpc error: code = NotFound desc = could not find container \"38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b\": container with ID starting with 
38bbbd836ff069f5ff91a5a6f247b790b3b86329c5c0bfa447ffb4b01534d21b not found: ID does not exist" Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.121764 4789 scope.go:117] "RemoveContainer" containerID="cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06" Dec 16 08:55:17 crc kubenswrapper[4789]: E1216 08:55:17.122245 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06\": container with ID starting with cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06 not found: ID does not exist" containerID="cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06" Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.122269 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06"} err="failed to get container status \"cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06\": rpc error: code = NotFound desc = could not find container \"cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06\": container with ID starting with cc64421d5b2304ea087aec2a2c609ce90a9d2ea56704cba44ab2170236663b06 not found: ID does not exist" Dec 16 08:55:17 crc kubenswrapper[4789]: I1216 08:55:17.122284 4789 scope.go:117] "RemoveContainer" containerID="3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459" Dec 16 08:55:17 crc kubenswrapper[4789]: E1216 08:55:17.122482 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459\": container with ID starting with 3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459 not found: ID does not exist" containerID="3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459" Dec 16 08:55:17 crc 
kubenswrapper[4789]: I1216 08:55:17.122503 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459"} err="failed to get container status \"3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459\": rpc error: code = NotFound desc = could not find container \"3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459\": container with ID starting with 3b3c22f149f6fba95000fddafbbceef8004bd82ee50e1b24e30d60c54525c459 not found: ID does not exist" Dec 16 08:55:18 crc kubenswrapper[4789]: I1216 08:55:18.115383 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" path="/var/lib/kubelet/pods/33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7/volumes" Dec 16 08:55:21 crc kubenswrapper[4789]: I1216 08:55:21.927639 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 08:55:21 crc kubenswrapper[4789]: I1216 08:55:21.928005 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 08:55:21 crc kubenswrapper[4789]: I1216 08:55:21.928046 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 08:55:21 crc kubenswrapper[4789]: I1216 08:55:21.928801 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 08:55:21 crc kubenswrapper[4789]: I1216 08:55:21.928853 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" gracePeriod=600 Dec 16 08:55:22 crc kubenswrapper[4789]: E1216 08:55:22.048032 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:55:22 crc kubenswrapper[4789]: I1216 08:55:22.061325 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" exitCode=0 Dec 16 08:55:22 crc kubenswrapper[4789]: I1216 08:55:22.061363 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"} Dec 16 08:55:22 crc kubenswrapper[4789]: I1216 08:55:22.061394 4789 scope.go:117] "RemoveContainer" containerID="77fffb00e7e434d3555e3b3538fc62383b6e955440c7beef42ce88d64343310d" Dec 16 08:55:22 crc kubenswrapper[4789]: I1216 08:55:22.062146 4789 
scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:55:22 crc kubenswrapper[4789]: E1216 08:55:22.062396 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:55:33 crc kubenswrapper[4789]: I1216 08:55:33.106114 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:55:33 crc kubenswrapper[4789]: E1216 08:55:33.107297 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:55:46 crc kubenswrapper[4789]: I1216 08:55:46.104681 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:55:46 crc kubenswrapper[4789]: E1216 08:55:46.105430 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:55:57 crc kubenswrapper[4789]: I1216 
08:55:57.106581 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:55:57 crc kubenswrapper[4789]: E1216 08:55:57.107454 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:56:02 crc kubenswrapper[4789]: I1216 08:56:02.447965 4789 generic.go:334] "Generic (PLEG): container finished" podID="4cb7847a-6a82-44b8-a1da-6583cb76efc8" containerID="79d11b75d1faf21b4e086dac3151f4979572c80631fa2352e5fb16054a9eb061" exitCode=0 Dec 16 08:56:02 crc kubenswrapper[4789]: I1216 08:56:02.448037 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" event={"ID":"4cb7847a-6a82-44b8-a1da-6583cb76efc8","Type":"ContainerDied","Data":"79d11b75d1faf21b4e086dac3151f4979572c80631fa2352e5fb16054a9eb061"} Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.871131 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.939093 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ssh-key\") pod \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.939149 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-inventory\") pod \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.939184 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ktx8\" (UniqueName: \"kubernetes.io/projected/4cb7847a-6a82-44b8-a1da-6583cb76efc8-kube-api-access-9ktx8\") pod \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.939265 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-combined-ca-bundle\") pod \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.939316 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-secret-0\") pod \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.939339 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ceph\") pod \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\" (UID: \"4cb7847a-6a82-44b8-a1da-6583cb76efc8\") " Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.946038 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4cb7847a-6a82-44b8-a1da-6583cb76efc8" (UID: "4cb7847a-6a82-44b8-a1da-6583cb76efc8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.946426 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ceph" (OuterVolumeSpecName: "ceph") pod "4cb7847a-6a82-44b8-a1da-6583cb76efc8" (UID: "4cb7847a-6a82-44b8-a1da-6583cb76efc8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.947153 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb7847a-6a82-44b8-a1da-6583cb76efc8-kube-api-access-9ktx8" (OuterVolumeSpecName: "kube-api-access-9ktx8") pod "4cb7847a-6a82-44b8-a1da-6583cb76efc8" (UID: "4cb7847a-6a82-44b8-a1da-6583cb76efc8"). InnerVolumeSpecName "kube-api-access-9ktx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.970359 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "4cb7847a-6a82-44b8-a1da-6583cb76efc8" (UID: "4cb7847a-6a82-44b8-a1da-6583cb76efc8"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.972439 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-inventory" (OuterVolumeSpecName: "inventory") pod "4cb7847a-6a82-44b8-a1da-6583cb76efc8" (UID: "4cb7847a-6a82-44b8-a1da-6583cb76efc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:56:03 crc kubenswrapper[4789]: I1216 08:56:03.973875 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4cb7847a-6a82-44b8-a1da-6583cb76efc8" (UID: "4cb7847a-6a82-44b8-a1da-6583cb76efc8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.041696 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.041731 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.041765 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ktx8\" (UniqueName: \"kubernetes.io/projected/4cb7847a-6a82-44b8-a1da-6583cb76efc8-kube-api-access-9ktx8\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.041778 4789 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:04 crc 
kubenswrapper[4789]: I1216 08:56:04.041789 4789 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.041798 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cb7847a-6a82-44b8-a1da-6583cb76efc8-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.466405 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" event={"ID":"4cb7847a-6a82-44b8-a1da-6583cb76efc8","Type":"ContainerDied","Data":"4a070f5635795f600321d99a0d7d6c2978acdb6079b120c7db6f8e1cc05716b1"} Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.466448 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a070f5635795f600321d99a0d7d6c2978acdb6079b120c7db6f8e1cc05716b1" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.466451 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4lrbh" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.555737 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-c7bll"] Dec 16 08:56:04 crc kubenswrapper[4789]: E1216 08:56:04.556241 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="registry-server" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.556259 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="registry-server" Dec 16 08:56:04 crc kubenswrapper[4789]: E1216 08:56:04.556282 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="extract-utilities" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.556288 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="extract-utilities" Dec 16 08:56:04 crc kubenswrapper[4789]: E1216 08:56:04.556304 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb7847a-6a82-44b8-a1da-6583cb76efc8" containerName="libvirt-openstack-openstack-cell1" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.556312 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb7847a-6a82-44b8-a1da-6583cb76efc8" containerName="libvirt-openstack-openstack-cell1" Dec 16 08:56:04 crc kubenswrapper[4789]: E1216 08:56:04.556327 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="extract-content" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.556334 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="extract-content" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.556598 4789 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4cb7847a-6a82-44b8-a1da-6583cb76efc8" containerName="libvirt-openstack-openstack-cell1" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.556618 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c1e26d-ec4b-4ee3-b3ec-e2549e91c7b7" containerName="registry-server" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.557410 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.565800 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.565938 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.566025 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.566167 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.565800 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.566546 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.566668 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.572296 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-c7bll"] Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.653950 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.653998 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654050 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654100 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654125 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-combined-ca-bundle\") pod 
\"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654191 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654268 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654335 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654405 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 
08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654458 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ceph\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.654507 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rgd\" (UniqueName: \"kubernetes.io/projected/f2844d2e-202c-470b-9bb9-cb0506134f3c-kube-api-access-t9rgd\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759357 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759448 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759507 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ceph\") pod 
\"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759565 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rgd\" (UniqueName: \"kubernetes.io/projected/f2844d2e-202c-470b-9bb9-cb0506134f3c-kube-api-access-t9rgd\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759592 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759613 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759652 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759687 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759710 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759728 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.759750 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.760971 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.764330 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.764481 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.764741 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.764845 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.765335 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-1\") pod 
\"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.765526 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.765932 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.773016 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.782677 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ceph\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.787433 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rgd\" 
(UniqueName: \"kubernetes.io/projected/f2844d2e-202c-470b-9bb9-cb0506134f3c-kube-api-access-t9rgd\") pod \"nova-cell1-openstack-openstack-cell1-c7bll\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:04 crc kubenswrapper[4789]: I1216 08:56:04.874593 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" Dec 16 08:56:05 crc kubenswrapper[4789]: I1216 08:56:05.452058 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-c7bll"] Dec 16 08:56:05 crc kubenswrapper[4789]: I1216 08:56:05.483023 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" event={"ID":"f2844d2e-202c-470b-9bb9-cb0506134f3c","Type":"ContainerStarted","Data":"411ef355a9b2ec7c41513b20238b2a7f7eb1313f8aaebd74aaaac965ab66f5e9"} Dec 16 08:56:06 crc kubenswrapper[4789]: I1216 08:56:06.511754 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" event={"ID":"f2844d2e-202c-470b-9bb9-cb0506134f3c","Type":"ContainerStarted","Data":"07483dd90372799568eec12dad54c395ad17af68731f7d352a451276a4eac780"} Dec 16 08:56:06 crc kubenswrapper[4789]: I1216 08:56:06.537147 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" podStartSLOduration=2.062683783 podStartE2EDuration="2.537127481s" podCreationTimestamp="2025-12-16 08:56:04 +0000 UTC" firstStartedPulling="2025-12-16 08:56:05.460428047 +0000 UTC m=+7503.722315676" lastFinishedPulling="2025-12-16 08:56:05.934871745 +0000 UTC m=+7504.196759374" observedRunningTime="2025-12-16 08:56:06.52927804 +0000 UTC m=+7504.791165689" watchObservedRunningTime="2025-12-16 08:56:06.537127481 +0000 UTC m=+7504.799015110" Dec 16 08:56:09 crc kubenswrapper[4789]: I1216 08:56:09.105019 
4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:56:09 crc kubenswrapper[4789]: E1216 08:56:09.105801 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:56:20 crc kubenswrapper[4789]: I1216 08:56:20.105714 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:56:20 crc kubenswrapper[4789]: E1216 08:56:20.106724 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:56:32 crc kubenswrapper[4789]: I1216 08:56:32.112977 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:56:32 crc kubenswrapper[4789]: E1216 08:56:32.113905 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:56:47 crc kubenswrapper[4789]: I1216 
08:56:47.104602 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:56:47 crc kubenswrapper[4789]: E1216 08:56:47.105378 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:56:59 crc kubenswrapper[4789]: I1216 08:56:59.105367 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:56:59 crc kubenswrapper[4789]: E1216 08:56:59.106132 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:57:12 crc kubenswrapper[4789]: I1216 08:57:12.114704 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:57:12 crc kubenswrapper[4789]: E1216 08:57:12.115603 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:57:26 crc 
kubenswrapper[4789]: I1216 08:57:26.104953 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:57:26 crc kubenswrapper[4789]: E1216 08:57:26.105850 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:57:39 crc kubenswrapper[4789]: I1216 08:57:39.105753 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:57:39 crc kubenswrapper[4789]: E1216 08:57:39.106570 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:57:50 crc kubenswrapper[4789]: I1216 08:57:50.105445 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:57:50 crc kubenswrapper[4789]: E1216 08:57:50.106709 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 
16 08:58:04 crc kubenswrapper[4789]: I1216 08:58:04.104687 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:58:04 crc kubenswrapper[4789]: E1216 08:58:04.105498 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.643004 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gj7p7"] Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.645961 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.656978 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj7p7"] Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.741477 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-utilities\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.741537 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvrq\" (UniqueName: \"kubernetes.io/projected/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-kube-api-access-kxvrq\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " 
pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.741626 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-catalog-content\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.844797 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-catalog-content\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.845297 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-utilities\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.845372 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxvrq\" (UniqueName: \"kubernetes.io/projected/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-kube-api-access-kxvrq\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.845504 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-catalog-content\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " 
pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.845835 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-utilities\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.865422 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvrq\" (UniqueName: \"kubernetes.io/projected/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-kube-api-access-kxvrq\") pod \"redhat-marketplace-gj7p7\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") " pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:12 crc kubenswrapper[4789]: I1216 08:58:12.975678 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:13 crc kubenswrapper[4789]: I1216 08:58:13.476994 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj7p7"] Dec 16 08:58:13 crc kubenswrapper[4789]: I1216 08:58:13.663064 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj7p7" event={"ID":"cfa7ec84-5458-47a2-b8d4-6ea6f670697c","Type":"ContainerStarted","Data":"f7e32aa60effc5771ce710e2288d85a60aabacdc80f81474a3e93e325a0d61d9"} Dec 16 08:58:14 crc kubenswrapper[4789]: I1216 08:58:14.671967 4789 generic.go:334] "Generic (PLEG): container finished" podID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerID="ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5" exitCode=0 Dec 16 08:58:14 crc kubenswrapper[4789]: I1216 08:58:14.672020 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj7p7" 
event={"ID":"cfa7ec84-5458-47a2-b8d4-6ea6f670697c","Type":"ContainerDied","Data":"ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5"} Dec 16 08:58:14 crc kubenswrapper[4789]: I1216 08:58:14.674214 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 08:58:15 crc kubenswrapper[4789]: I1216 08:58:15.687387 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj7p7" event={"ID":"cfa7ec84-5458-47a2-b8d4-6ea6f670697c","Type":"ContainerStarted","Data":"2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61"} Dec 16 08:58:16 crc kubenswrapper[4789]: I1216 08:58:16.698469 4789 generic.go:334] "Generic (PLEG): container finished" podID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerID="2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61" exitCode=0 Dec 16 08:58:16 crc kubenswrapper[4789]: I1216 08:58:16.698502 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj7p7" event={"ID":"cfa7ec84-5458-47a2-b8d4-6ea6f670697c","Type":"ContainerDied","Data":"2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61"} Dec 16 08:58:17 crc kubenswrapper[4789]: I1216 08:58:17.106090 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:58:17 crc kubenswrapper[4789]: E1216 08:58:17.106508 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:58:18 crc kubenswrapper[4789]: I1216 08:58:18.720149 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gj7p7" event={"ID":"cfa7ec84-5458-47a2-b8d4-6ea6f670697c","Type":"ContainerStarted","Data":"affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3"} Dec 16 08:58:18 crc kubenswrapper[4789]: I1216 08:58:18.750020 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gj7p7" podStartSLOduration=3.087273411 podStartE2EDuration="6.749995758s" podCreationTimestamp="2025-12-16 08:58:12 +0000 UTC" firstStartedPulling="2025-12-16 08:58:14.673968965 +0000 UTC m=+7632.935856594" lastFinishedPulling="2025-12-16 08:58:18.336691312 +0000 UTC m=+7636.598578941" observedRunningTime="2025-12-16 08:58:18.735569615 +0000 UTC m=+7636.997457244" watchObservedRunningTime="2025-12-16 08:58:18.749995758 +0000 UTC m=+7637.011883387" Dec 16 08:58:22 crc kubenswrapper[4789]: I1216 08:58:22.976634 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:22 crc kubenswrapper[4789]: I1216 08:58:22.977185 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:23 crc kubenswrapper[4789]: I1216 08:58:23.024513 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:23 crc kubenswrapper[4789]: I1216 08:58:23.818064 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gj7p7" Dec 16 08:58:23 crc kubenswrapper[4789]: I1216 08:58:23.870800 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj7p7"] Dec 16 08:58:25 crc kubenswrapper[4789]: I1216 08:58:25.786369 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gj7p7" 
podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="registry-server" containerID="cri-o://affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3" gracePeriod=2
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.323936 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj7p7"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.333847 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-catalog-content\") pod \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") "
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.333940 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxvrq\" (UniqueName: \"kubernetes.io/projected/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-kube-api-access-kxvrq\") pod \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") "
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.334060 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-utilities\") pod \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\" (UID: \"cfa7ec84-5458-47a2-b8d4-6ea6f670697c\") "
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.334790 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-utilities" (OuterVolumeSpecName: "utilities") pod "cfa7ec84-5458-47a2-b8d4-6ea6f670697c" (UID: "cfa7ec84-5458-47a2-b8d4-6ea6f670697c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.339778 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-kube-api-access-kxvrq" (OuterVolumeSpecName: "kube-api-access-kxvrq") pod "cfa7ec84-5458-47a2-b8d4-6ea6f670697c" (UID: "cfa7ec84-5458-47a2-b8d4-6ea6f670697c"). InnerVolumeSpecName "kube-api-access-kxvrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.359509 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfa7ec84-5458-47a2-b8d4-6ea6f670697c" (UID: "cfa7ec84-5458-47a2-b8d4-6ea6f670697c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.435697 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.435733 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.435745 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxvrq\" (UniqueName: \"kubernetes.io/projected/cfa7ec84-5458-47a2-b8d4-6ea6f670697c-kube-api-access-kxvrq\") on node \"crc\" DevicePath \"\""
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.799114 4789 generic.go:334] "Generic (PLEG): container finished" podID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerID="affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3" exitCode=0
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.799158 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj7p7" event={"ID":"cfa7ec84-5458-47a2-b8d4-6ea6f670697c","Type":"ContainerDied","Data":"affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3"}
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.799185 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj7p7" event={"ID":"cfa7ec84-5458-47a2-b8d4-6ea6f670697c","Type":"ContainerDied","Data":"f7e32aa60effc5771ce710e2288d85a60aabacdc80f81474a3e93e325a0d61d9"}
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.799203 4789 scope.go:117] "RemoveContainer" containerID="affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.799327 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj7p7"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.822803 4789 scope.go:117] "RemoveContainer" containerID="2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.835426 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj7p7"]
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.843843 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj7p7"]
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.864621 4789 scope.go:117] "RemoveContainer" containerID="ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.897111 4789 scope.go:117] "RemoveContainer" containerID="affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3"
Dec 16 08:58:26 crc kubenswrapper[4789]: E1216 08:58:26.897586 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3\": container with ID starting with affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3 not found: ID does not exist" containerID="affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.897629 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3"} err="failed to get container status \"affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3\": rpc error: code = NotFound desc = could not find container \"affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3\": container with ID starting with affc1a23c92c1b5bfb464bfadcbb5077d2dbca72f40075b2a5101f7d394df2a3 not found: ID does not exist"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.897659 4789 scope.go:117] "RemoveContainer" containerID="2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61"
Dec 16 08:58:26 crc kubenswrapper[4789]: E1216 08:58:26.898068 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61\": container with ID starting with 2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61 not found: ID does not exist" containerID="2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.898098 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61"} err="failed to get container status \"2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61\": rpc error: code = NotFound desc = could not find container \"2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61\": container with ID starting with 2ee07592f50126dfd7e84e77c449ba3ae1f53f17e07b8acdffc5e6d0f101dd61 not found: ID does not exist"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.898115 4789 scope.go:117] "RemoveContainer" containerID="ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5"
Dec 16 08:58:26 crc kubenswrapper[4789]: E1216 08:58:26.898489 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5\": container with ID starting with ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5 not found: ID does not exist" containerID="ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5"
Dec 16 08:58:26 crc kubenswrapper[4789]: I1216 08:58:26.898512 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5"} err="failed to get container status \"ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5\": rpc error: code = NotFound desc = could not find container \"ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5\": container with ID starting with ae817e58f9fa08e807a7265d08d059d64cd9d2143bc4c857ac941d569c5d18f5 not found: ID does not exist"
Dec 16 08:58:28 crc kubenswrapper[4789]: I1216 08:58:28.105208 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"
Dec 16 08:58:28 crc kubenswrapper[4789]: E1216 08:58:28.105718 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 08:58:28 crc kubenswrapper[4789]: I1216 08:58:28.125383 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" path="/var/lib/kubelet/pods/cfa7ec84-5458-47a2-b8d4-6ea6f670697c/volumes"
Dec 16 08:58:41 crc kubenswrapper[4789]: I1216 08:58:41.105292 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"
Dec 16 08:58:41 crc kubenswrapper[4789]: E1216 08:58:41.106112 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 08:58:55 crc kubenswrapper[4789]: I1216 08:58:55.105567 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"
Dec 16 08:58:55 crc kubenswrapper[4789]: E1216 08:58:55.106617 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 08:59:09 crc kubenswrapper[4789]: I1216 08:59:09.105269 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"
Dec 16 08:59:09 crc kubenswrapper[4789]: E1216 08:59:09.105931 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb"
Dec 16 08:59:13 crc kubenswrapper[4789]: I1216 08:59:13.907690 4789 generic.go:334] "Generic (PLEG): container finished" podID="f2844d2e-202c-470b-9bb9-cb0506134f3c" containerID="07483dd90372799568eec12dad54c395ad17af68731f7d352a451276a4eac780" exitCode=0
Dec 16 08:59:13 crc kubenswrapper[4789]: I1216 08:59:13.908372 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" event={"ID":"f2844d2e-202c-470b-9bb9-cb0506134f3c","Type":"ContainerDied","Data":"07483dd90372799568eec12dad54c395ad17af68731f7d352a451276a4eac780"}
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.372017 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll"
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.560437 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ceph\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.560578 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-1\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.560837 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ssh-key\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.560886 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-1\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.560943 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-inventory\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.560989 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-0\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.561060 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-0\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.561146 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-combined-ca-bundle\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.561838 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-1\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.562212 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-0\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.562282 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9rgd\" (UniqueName: \"kubernetes.io/projected/f2844d2e-202c-470b-9bb9-cb0506134f3c-kube-api-access-t9rgd\") pod \"f2844d2e-202c-470b-9bb9-cb0506134f3c\" (UID: \"f2844d2e-202c-470b-9bb9-cb0506134f3c\") "
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.567130 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2844d2e-202c-470b-9bb9-cb0506134f3c-kube-api-access-t9rgd" (OuterVolumeSpecName: "kube-api-access-t9rgd") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "kube-api-access-t9rgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.567259 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ceph" (OuterVolumeSpecName: "ceph") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.567366 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.587046 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.589563 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.593685 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.595626 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.600466 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.601736 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-inventory" (OuterVolumeSpecName: "inventory") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.603678 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.606865 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2844d2e-202c-470b-9bb9-cb0506134f3c" (UID: "f2844d2e-202c-470b-9bb9-cb0506134f3c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666786 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666830 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666843 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-inventory\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666853 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666862 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666871 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666879 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666893 4789 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666904 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9rgd\" (UniqueName: \"kubernetes.io/projected/f2844d2e-202c-470b-9bb9-cb0506134f3c-kube-api-access-t9rgd\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666932 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-ceph\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.666943 4789 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2844d2e-202c-470b-9bb9-cb0506134f3c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.929691 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll" event={"ID":"f2844d2e-202c-470b-9bb9-cb0506134f3c","Type":"ContainerDied","Data":"411ef355a9b2ec7c41513b20238b2a7f7eb1313f8aaebd74aaaac965ab66f5e9"}
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.929757 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="411ef355a9b2ec7c41513b20238b2a7f7eb1313f8aaebd74aaaac965ab66f5e9"
Dec 16 08:59:15 crc kubenswrapper[4789]: I1216 08:59:15.929945 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-c7bll"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.034443 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-zkjlb"]
Dec 16 08:59:16 crc kubenswrapper[4789]: E1216 08:59:16.034853 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2844d2e-202c-470b-9bb9-cb0506134f3c" containerName="nova-cell1-openstack-openstack-cell1"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.034870 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2844d2e-202c-470b-9bb9-cb0506134f3c" containerName="nova-cell1-openstack-openstack-cell1"
Dec 16 08:59:16 crc kubenswrapper[4789]: E1216 08:59:16.034887 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="extract-content"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.034893 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="extract-content"
Dec 16 08:59:16 crc kubenswrapper[4789]: E1216 08:59:16.035781 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="registry-server"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.035802 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="registry-server"
Dec 16 08:59:16 crc kubenswrapper[4789]: E1216 08:59:16.035826 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="extract-utilities"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.035834 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="extract-utilities"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.037339 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa7ec84-5458-47a2-b8d4-6ea6f670697c" containerName="registry-server"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.037407 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2844d2e-202c-470b-9bb9-cb0506134f3c" containerName="nova-cell1-openstack-openstack-cell1"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.039523 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.059816 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.060724 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.063442 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-zkjlb"]
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.063464 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.064006 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.064606 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.078835 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7825c\" (UniqueName: \"kubernetes.io/projected/ea52433e-1eda-40ec-8bb9-32652828eeec-kube-api-access-7825c\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.078949 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-inventory\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.079028 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.079065 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.079104 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.079164 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ssh-key\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.079209 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceph\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.079268 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.181686 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7825c\" (UniqueName: \"kubernetes.io/projected/ea52433e-1eda-40ec-8bb9-32652828eeec-kube-api-access-7825c\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.181751 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-inventory\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.181797 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.181817 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.181840 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.181871 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ssh-key\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.181898 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceph\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.182043 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.187855 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.188016 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.188623 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.189092 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ssh-key\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.189863 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-inventory\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.191635 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceph\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.195515 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.198946 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7825c\" (UniqueName: \"kubernetes.io/projected/ea52433e-1eda-40ec-8bb9-32652828eeec-kube-api-access-7825c\") pod \"telemetry-openstack-openstack-cell1-zkjlb\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.382947 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb"
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.915276 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-zkjlb"]
Dec 16 08:59:16 crc kubenswrapper[4789]: I1216 08:59:16.939929 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb" event={"ID":"ea52433e-1eda-40ec-8bb9-32652828eeec","Type":"ContainerStarted","Data":"d637b71d62ee85c103bfd4bfcadc4cd47e2df169d9e50d561859b85e21ab7a25"}
Dec 16 08:59:17 crc kubenswrapper[4789]: I1216 08:59:17.949024 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb" event={"ID":"ea52433e-1eda-40ec-8bb9-32652828eeec","Type":"ContainerStarted","Data":"dafaa112ba14fd1fe367c6ceb4210e128d6e1f398ccbb737f54e92627b06c27c"}
Dec 16 08:59:17 crc kubenswrapper[4789]: I1216 08:59:17.973463 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb" podStartSLOduration=1.431499369 podStartE2EDuration="1.973444884s" podCreationTimestamp="2025-12-16 08:59:16 +0000 UTC" firstStartedPulling="2025-12-16 08:59:16.924424396 +0000 UTC m=+7695.186312025" lastFinishedPulling="2025-12-16 08:59:17.466369911 +0000 UTC m=+7695.728257540" observedRunningTime="2025-12-16 08:59:17.968143115 +0000 UTC m=+7696.230030744" watchObservedRunningTime="2025-12-16 08:59:17.973444884 +0000 UTC m=+7696.235332503"
Dec 16 08:59:24 crc kubenswrapper[4789]: I1216 08:59:24.106745 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e"
Dec 16 08:59:24 crc kubenswrapper[4789]: E1216 08:59:24.107899 4789 pod_workers.go:1301]
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:59:39 crc kubenswrapper[4789]: I1216 08:59:39.105044 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:59:39 crc kubenswrapper[4789]: E1216 08:59:39.105957 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 08:59:53 crc kubenswrapper[4789]: I1216 08:59:53.105641 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 08:59:53 crc kubenswrapper[4789]: E1216 08:59:53.108577 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.185517 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w"] Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 
09:00:00.191809 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.194394 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.197168 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.199566 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w"] Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.221892 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-config-volume\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.222781 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh4mn\" (UniqueName: \"kubernetes.io/projected/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-kube-api-access-fh4mn\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.222886 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-secret-volume\") pod \"collect-profiles-29431260-hw99w\" (UID: 
\"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.326644 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-secret-volume\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.326720 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh4mn\" (UniqueName: \"kubernetes.io/projected/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-kube-api-access-fh4mn\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.326852 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-config-volume\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.328064 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-config-volume\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.336439 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-secret-volume\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.347527 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh4mn\" (UniqueName: \"kubernetes.io/projected/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-kube-api-access-fh4mn\") pod \"collect-profiles-29431260-hw99w\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.516743 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.986130 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dddwl"] Dec 16 09:00:00 crc kubenswrapper[4789]: I1216 09:00:00.989225 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.006813 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w"] Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.034697 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dddwl"] Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.041048 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-catalog-content\") pod \"certified-operators-dddwl\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.041145 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-utilities\") pod \"certified-operators-dddwl\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.041241 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxwq\" (UniqueName: \"kubernetes.io/projected/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-kube-api-access-9sxwq\") pod \"certified-operators-dddwl\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.143517 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-catalog-content\") pod \"certified-operators-dddwl\" (UID: 
\"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.143614 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-utilities\") pod \"certified-operators-dddwl\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.143722 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxwq\" (UniqueName: \"kubernetes.io/projected/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-kube-api-access-9sxwq\") pod \"certified-operators-dddwl\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.146020 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-catalog-content\") pod \"certified-operators-dddwl\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.147627 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-utilities\") pod \"certified-operators-dddwl\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.172517 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxwq\" (UniqueName: \"kubernetes.io/projected/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-kube-api-access-9sxwq\") pod \"certified-operators-dddwl\" (UID: 
\"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.367515 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" event={"ID":"d6ce239e-2d5d-4b88-8a08-b1b617fddb20","Type":"ContainerStarted","Data":"ce9bb347e18af2bd36960e6ab951985a023ac91646d6c2ca087838a1081e8a32"} Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.367564 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" event={"ID":"d6ce239e-2d5d-4b88-8a08-b1b617fddb20","Type":"ContainerStarted","Data":"ed672638cba4bdcce625558f33b5d64e3bd14a3d78fb2158b210b5f01afd876f"} Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.391717 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.392790 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" podStartSLOduration=1.392767896 podStartE2EDuration="1.392767896s" podCreationTimestamp="2025-12-16 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:00:01.391274008 +0000 UTC m=+7739.653161637" watchObservedRunningTime="2025-12-16 09:00:01.392767896 +0000 UTC m=+7739.654655525" Dec 16 09:00:01 crc kubenswrapper[4789]: I1216 09:00:01.944294 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dddwl"] Dec 16 09:00:02 crc kubenswrapper[4789]: I1216 09:00:02.399492 4789 generic.go:334] "Generic (PLEG): container finished" podID="d6ce239e-2d5d-4b88-8a08-b1b617fddb20" containerID="ce9bb347e18af2bd36960e6ab951985a023ac91646d6c2ca087838a1081e8a32" 
exitCode=0 Dec 16 09:00:02 crc kubenswrapper[4789]: I1216 09:00:02.399950 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" event={"ID":"d6ce239e-2d5d-4b88-8a08-b1b617fddb20","Type":"ContainerDied","Data":"ce9bb347e18af2bd36960e6ab951985a023ac91646d6c2ca087838a1081e8a32"} Dec 16 09:00:02 crc kubenswrapper[4789]: I1216 09:00:02.405528 4789 generic.go:334] "Generic (PLEG): container finished" podID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerID="d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d" exitCode=0 Dec 16 09:00:02 crc kubenswrapper[4789]: I1216 09:00:02.405976 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddwl" event={"ID":"7955c1ac-3ed3-460a-a434-8d0a73a7d19a","Type":"ContainerDied","Data":"d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d"} Dec 16 09:00:02 crc kubenswrapper[4789]: I1216 09:00:02.406020 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddwl" event={"ID":"7955c1ac-3ed3-460a-a434-8d0a73a7d19a","Type":"ContainerStarted","Data":"127d1a558d6d3f831e312b236e2115ef5ec7ae6cbe0fa41019ad30a030b624ae"} Dec 16 09:00:03 crc kubenswrapper[4789]: I1216 09:00:03.789208 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:03 crc kubenswrapper[4789]: I1216 09:00:03.917206 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-config-volume\") pod \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " Dec 16 09:00:03 crc kubenswrapper[4789]: I1216 09:00:03.917330 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-secret-volume\") pod \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " Dec 16 09:00:03 crc kubenswrapper[4789]: I1216 09:00:03.917564 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh4mn\" (UniqueName: \"kubernetes.io/projected/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-kube-api-access-fh4mn\") pod \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\" (UID: \"d6ce239e-2d5d-4b88-8a08-b1b617fddb20\") " Dec 16 09:00:03 crc kubenswrapper[4789]: I1216 09:00:03.918121 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-config-volume" (OuterVolumeSpecName: "config-volume") pod "d6ce239e-2d5d-4b88-8a08-b1b617fddb20" (UID: "d6ce239e-2d5d-4b88-8a08-b1b617fddb20"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:00:03 crc kubenswrapper[4789]: I1216 09:00:03.922825 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d6ce239e-2d5d-4b88-8a08-b1b617fddb20" (UID: "d6ce239e-2d5d-4b88-8a08-b1b617fddb20"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:00:03 crc kubenswrapper[4789]: I1216 09:00:03.922957 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-kube-api-access-fh4mn" (OuterVolumeSpecName: "kube-api-access-fh4mn") pod "d6ce239e-2d5d-4b88-8a08-b1b617fddb20" (UID: "d6ce239e-2d5d-4b88-8a08-b1b617fddb20"). InnerVolumeSpecName "kube-api-access-fh4mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.021388 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh4mn\" (UniqueName: \"kubernetes.io/projected/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-kube-api-access-fh4mn\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.021459 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.021469 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6ce239e-2d5d-4b88-8a08-b1b617fddb20-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.105959 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 09:00:04 crc kubenswrapper[4789]: E1216 09:00:04.106301 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" 
Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.425819 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" event={"ID":"d6ce239e-2d5d-4b88-8a08-b1b617fddb20","Type":"ContainerDied","Data":"ed672638cba4bdcce625558f33b5d64e3bd14a3d78fb2158b210b5f01afd876f"} Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.425858 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed672638cba4bdcce625558f33b5d64e3bd14a3d78fb2158b210b5f01afd876f" Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.425891 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431260-hw99w" Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.473318 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl"] Dec 16 09:00:04 crc kubenswrapper[4789]: I1216 09:00:04.483818 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431215-hlbrl"] Dec 16 09:00:06 crc kubenswrapper[4789]: I1216 09:00:06.118058 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d87b09-4747-4165-b443-e19c0dfbbec8" path="/var/lib/kubelet/pods/00d87b09-4747-4165-b443-e19c0dfbbec8/volumes" Dec 16 09:00:06 crc kubenswrapper[4789]: I1216 09:00:06.449877 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddwl" event={"ID":"7955c1ac-3ed3-460a-a434-8d0a73a7d19a","Type":"ContainerStarted","Data":"13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91"} Dec 16 09:00:07 crc kubenswrapper[4789]: I1216 09:00:07.461099 4789 generic.go:334] "Generic (PLEG): container finished" podID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerID="13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91" exitCode=0 
Dec 16 09:00:07 crc kubenswrapper[4789]: I1216 09:00:07.461266 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddwl" event={"ID":"7955c1ac-3ed3-460a-a434-8d0a73a7d19a","Type":"ContainerDied","Data":"13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91"} Dec 16 09:00:08 crc kubenswrapper[4789]: I1216 09:00:08.473479 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddwl" event={"ID":"7955c1ac-3ed3-460a-a434-8d0a73a7d19a","Type":"ContainerStarted","Data":"9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe"} Dec 16 09:00:08 crc kubenswrapper[4789]: I1216 09:00:08.497231 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dddwl" podStartSLOduration=2.746724875 podStartE2EDuration="8.497207359s" podCreationTimestamp="2025-12-16 09:00:00 +0000 UTC" firstStartedPulling="2025-12-16 09:00:02.411781841 +0000 UTC m=+7740.673669470" lastFinishedPulling="2025-12-16 09:00:08.162264325 +0000 UTC m=+7746.424151954" observedRunningTime="2025-12-16 09:00:08.49316911 +0000 UTC m=+7746.755056749" watchObservedRunningTime="2025-12-16 09:00:08.497207359 +0000 UTC m=+7746.759094988" Dec 16 09:00:11 crc kubenswrapper[4789]: I1216 09:00:11.392400 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:11 crc kubenswrapper[4789]: I1216 09:00:11.392769 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:11 crc kubenswrapper[4789]: I1216 09:00:11.452940 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:15 crc kubenswrapper[4789]: I1216 09:00:15.105401 4789 scope.go:117] "RemoveContainer" 
containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 09:00:15 crc kubenswrapper[4789]: E1216 09:00:15.106275 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:00:21 crc kubenswrapper[4789]: I1216 09:00:21.440485 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:21 crc kubenswrapper[4789]: I1216 09:00:21.497418 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dddwl"] Dec 16 09:00:21 crc kubenswrapper[4789]: I1216 09:00:21.585677 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dddwl" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="registry-server" containerID="cri-o://9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe" gracePeriod=2 Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.130033 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.206570 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-utilities\") pod \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.207574 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxwq\" (UniqueName: \"kubernetes.io/projected/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-kube-api-access-9sxwq\") pod \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.207876 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-catalog-content\") pod \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\" (UID: \"7955c1ac-3ed3-460a-a434-8d0a73a7d19a\") " Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.208314 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-utilities" (OuterVolumeSpecName: "utilities") pod "7955c1ac-3ed3-460a-a434-8d0a73a7d19a" (UID: "7955c1ac-3ed3-460a-a434-8d0a73a7d19a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.208791 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.215803 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-kube-api-access-9sxwq" (OuterVolumeSpecName: "kube-api-access-9sxwq") pod "7955c1ac-3ed3-460a-a434-8d0a73a7d19a" (UID: "7955c1ac-3ed3-460a-a434-8d0a73a7d19a"). InnerVolumeSpecName "kube-api-access-9sxwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.269179 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7955c1ac-3ed3-460a-a434-8d0a73a7d19a" (UID: "7955c1ac-3ed3-460a-a434-8d0a73a7d19a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.310879 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.310932 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxwq\" (UniqueName: \"kubernetes.io/projected/7955c1ac-3ed3-460a-a434-8d0a73a7d19a-kube-api-access-9sxwq\") on node \"crc\" DevicePath \"\"" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.595844 4789 generic.go:334] "Generic (PLEG): container finished" podID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerID="9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe" exitCode=0 Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.595887 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddwl" event={"ID":"7955c1ac-3ed3-460a-a434-8d0a73a7d19a","Type":"ContainerDied","Data":"9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe"} Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.595926 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddwl" event={"ID":"7955c1ac-3ed3-460a-a434-8d0a73a7d19a","Type":"ContainerDied","Data":"127d1a558d6d3f831e312b236e2115ef5ec7ae6cbe0fa41019ad30a030b624ae"} Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.595944 4789 scope.go:117] "RemoveContainer" containerID="9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.595967 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dddwl" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.633789 4789 scope.go:117] "RemoveContainer" containerID="13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.635582 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dddwl"] Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.645930 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dddwl"] Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.659325 4789 scope.go:117] "RemoveContainer" containerID="d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.700210 4789 scope.go:117] "RemoveContainer" containerID="9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe" Dec 16 09:00:22 crc kubenswrapper[4789]: E1216 09:00:22.702725 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe\": container with ID starting with 9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe not found: ID does not exist" containerID="9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.702771 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe"} err="failed to get container status \"9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe\": rpc error: code = NotFound desc = could not find container \"9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe\": container with ID starting with 9b8bb710f2103853d8800ae1d51bcc77632993acab1e2c2347d1de138befbdfe not 
found: ID does not exist" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.702797 4789 scope.go:117] "RemoveContainer" containerID="13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91" Dec 16 09:00:22 crc kubenswrapper[4789]: E1216 09:00:22.703767 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91\": container with ID starting with 13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91 not found: ID does not exist" containerID="13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.703798 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91"} err="failed to get container status \"13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91\": rpc error: code = NotFound desc = could not find container \"13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91\": container with ID starting with 13f869a8836a5b462d66f53c68f41969b7007d4c284ae98c016d9c633aac2c91 not found: ID does not exist" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.703820 4789 scope.go:117] "RemoveContainer" containerID="d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d" Dec 16 09:00:22 crc kubenswrapper[4789]: E1216 09:00:22.704304 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d\": container with ID starting with d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d not found: ID does not exist" containerID="d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d" Dec 16 09:00:22 crc kubenswrapper[4789]: I1216 09:00:22.704339 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d"} err="failed to get container status \"d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d\": rpc error: code = NotFound desc = could not find container \"d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d\": container with ID starting with d1f33ffbc1240cad20a6060999bc2ae8371b629fe730a8eab0f1c47bffd3766d not found: ID does not exist" Dec 16 09:00:24 crc kubenswrapper[4789]: I1216 09:00:24.118434 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" path="/var/lib/kubelet/pods/7955c1ac-3ed3-460a-a434-8d0a73a7d19a/volumes" Dec 16 09:00:29 crc kubenswrapper[4789]: I1216 09:00:29.105560 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 09:00:29 crc kubenswrapper[4789]: I1216 09:00:29.659748 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"51fc0824d67e7b6f11ebf37c8b906bff4c8102cb814347e2bde3b2da83a60a9b"} Dec 16 09:00:34 crc kubenswrapper[4789]: I1216 09:00:34.481907 4789 scope.go:117] "RemoveContainer" containerID="4f25896a9c7656d4842b0c5701e84dca548fe0d82e5d6e47a792b8c468f56803" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.148373 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29431261-m4757"] Dec 16 09:01:00 crc kubenswrapper[4789]: E1216 09:01:00.149439 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ce239e-2d5d-4b88-8a08-b1b617fddb20" containerName="collect-profiles" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.149454 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ce239e-2d5d-4b88-8a08-b1b617fddb20" 
containerName="collect-profiles" Dec 16 09:01:00 crc kubenswrapper[4789]: E1216 09:01:00.149476 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="registry-server" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.149486 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="registry-server" Dec 16 09:01:00 crc kubenswrapper[4789]: E1216 09:01:00.149500 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="extract-utilities" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.149508 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="extract-utilities" Dec 16 09:01:00 crc kubenswrapper[4789]: E1216 09:01:00.149524 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="extract-content" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.149530 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="extract-content" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.149769 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ce239e-2d5d-4b88-8a08-b1b617fddb20" containerName="collect-profiles" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.149789 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="7955c1ac-3ed3-460a-a434-8d0a73a7d19a" containerName="registry-server" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.150674 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.162534 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431261-m4757"] Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.266243 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6rlc\" (UniqueName: \"kubernetes.io/projected/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-kube-api-access-q6rlc\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.266319 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-fernet-keys\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.266405 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-config-data\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.266717 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-combined-ca-bundle\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.368362 4789 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-q6rlc\" (UniqueName: \"kubernetes.io/projected/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-kube-api-access-q6rlc\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.368439 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-fernet-keys\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.368489 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-config-data\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.368635 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-combined-ca-bundle\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.375576 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-fernet-keys\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.375632 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-config-data\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.376059 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-combined-ca-bundle\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.386168 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6rlc\" (UniqueName: \"kubernetes.io/projected/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-kube-api-access-q6rlc\") pod \"keystone-cron-29431261-m4757\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.481324 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.949396 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431261-m4757"] Dec 16 09:01:00 crc kubenswrapper[4789]: I1216 09:01:00.987068 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-m4757" event={"ID":"8d1fba60-e7e3-4cb8-9b09-859d467c1f62","Type":"ContainerStarted","Data":"db5a8b7a0e7e56504b834b84a5422e7e2e0c12a1a565774986ae14af6b662fe7"} Dec 16 09:01:01 crc kubenswrapper[4789]: I1216 09:01:01.997675 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-m4757" event={"ID":"8d1fba60-e7e3-4cb8-9b09-859d467c1f62","Type":"ContainerStarted","Data":"0cf66ce3763dc496cd925bb5c45c12a2d96dbebc63048d03b691193371412311"} Dec 16 09:01:02 crc kubenswrapper[4789]: I1216 09:01:02.024535 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29431261-m4757" podStartSLOduration=2.024510237 podStartE2EDuration="2.024510237s" podCreationTimestamp="2025-12-16 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:01:02.013686103 +0000 UTC m=+7800.275573732" watchObservedRunningTime="2025-12-16 09:01:02.024510237 +0000 UTC m=+7800.286397866" Dec 16 09:01:04 crc kubenswrapper[4789]: I1216 09:01:04.018454 4789 generic.go:334] "Generic (PLEG): container finished" podID="8d1fba60-e7e3-4cb8-9b09-859d467c1f62" containerID="0cf66ce3763dc496cd925bb5c45c12a2d96dbebc63048d03b691193371412311" exitCode=0 Dec 16 09:01:04 crc kubenswrapper[4789]: I1216 09:01:04.018535 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-m4757" 
event={"ID":"8d1fba60-e7e3-4cb8-9b09-859d467c1f62","Type":"ContainerDied","Data":"0cf66ce3763dc496cd925bb5c45c12a2d96dbebc63048d03b691193371412311"} Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.399284 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.480578 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6rlc\" (UniqueName: \"kubernetes.io/projected/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-kube-api-access-q6rlc\") pod \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.480775 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-combined-ca-bundle\") pod \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.480933 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-fernet-keys\") pod \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.480971 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-config-data\") pod \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\" (UID: \"8d1fba60-e7e3-4cb8-9b09-859d467c1f62\") " Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.487428 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-kube-api-access-q6rlc" 
(OuterVolumeSpecName: "kube-api-access-q6rlc") pod "8d1fba60-e7e3-4cb8-9b09-859d467c1f62" (UID: "8d1fba60-e7e3-4cb8-9b09-859d467c1f62"). InnerVolumeSpecName "kube-api-access-q6rlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.487705 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8d1fba60-e7e3-4cb8-9b09-859d467c1f62" (UID: "8d1fba60-e7e3-4cb8-9b09-859d467c1f62"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.509213 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d1fba60-e7e3-4cb8-9b09-859d467c1f62" (UID: "8d1fba60-e7e3-4cb8-9b09-859d467c1f62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.533455 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-config-data" (OuterVolumeSpecName: "config-data") pod "8d1fba60-e7e3-4cb8-9b09-859d467c1f62" (UID: "8d1fba60-e7e3-4cb8-9b09-859d467c1f62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.583343 4789 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.583387 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.583400 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6rlc\" (UniqueName: \"kubernetes.io/projected/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-kube-api-access-q6rlc\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:05 crc kubenswrapper[4789]: I1216 09:01:05.583413 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1fba60-e7e3-4cb8-9b09-859d467c1f62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:01:06 crc kubenswrapper[4789]: I1216 09:01:06.040350 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431261-m4757" event={"ID":"8d1fba60-e7e3-4cb8-9b09-859d467c1f62","Type":"ContainerDied","Data":"db5a8b7a0e7e56504b834b84a5422e7e2e0c12a1a565774986ae14af6b662fe7"} Dec 16 09:01:06 crc kubenswrapper[4789]: I1216 09:01:06.040722 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db5a8b7a0e7e56504b834b84a5422e7e2e0c12a1a565774986ae14af6b662fe7" Dec 16 09:01:06 crc kubenswrapper[4789]: I1216 09:01:06.040616 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431261-m4757" Dec 16 09:02:51 crc kubenswrapper[4789]: I1216 09:02:51.928299 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:02:51 crc kubenswrapper[4789]: I1216 09:02:51.928972 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:03:21 crc kubenswrapper[4789]: I1216 09:03:21.927756 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:03:21 crc kubenswrapper[4789]: I1216 09:03:21.928315 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.047069 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nszhk"] Dec 16 09:03:50 crc kubenswrapper[4789]: E1216 09:03:50.048037 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1fba60-e7e3-4cb8-9b09-859d467c1f62" containerName="keystone-cron" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.048052 
4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1fba60-e7e3-4cb8-9b09-859d467c1f62" containerName="keystone-cron" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.048235 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1fba60-e7e3-4cb8-9b09-859d467c1f62" containerName="keystone-cron" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.050276 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.078121 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nszhk"] Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.185124 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-utilities\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.185166 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9qtb\" (UniqueName: \"kubernetes.io/projected/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-kube-api-access-j9qtb\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.185202 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-catalog-content\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 
09:03:50.287677 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-utilities\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.288357 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9qtb\" (UniqueName: \"kubernetes.io/projected/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-kube-api-access-j9qtb\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.288285 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-utilities\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.288456 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-catalog-content\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.288812 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-catalog-content\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.312387 4789 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9qtb\" (UniqueName: \"kubernetes.io/projected/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-kube-api-access-j9qtb\") pod \"community-operators-nszhk\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.368837 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:03:50 crc kubenswrapper[4789]: I1216 09:03:50.943346 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nszhk"] Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.809726 4789 generic.go:334] "Generic (PLEG): container finished" podID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerID="044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19" exitCode=0 Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.809842 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszhk" event={"ID":"5cca32d7-6960-4a5b-9ee6-28a898dcddb0","Type":"ContainerDied","Data":"044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19"} Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.810204 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszhk" event={"ID":"5cca32d7-6960-4a5b-9ee6-28a898dcddb0","Type":"ContainerStarted","Data":"ee5cb850e9bc2abb647cc0beed8ae8c62e5bf52b70abf2ee134aa4e6a3229ecf"} Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.811746 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.928629 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.928698 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.928747 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.929614 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51fc0824d67e7b6f11ebf37c8b906bff4c8102cb814347e2bde3b2da83a60a9b"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:03:51 crc kubenswrapper[4789]: I1216 09:03:51.929671 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://51fc0824d67e7b6f11ebf37c8b906bff4c8102cb814347e2bde3b2da83a60a9b" gracePeriod=600 Dec 16 09:03:52 crc kubenswrapper[4789]: I1216 09:03:52.820541 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="51fc0824d67e7b6f11ebf37c8b906bff4c8102cb814347e2bde3b2da83a60a9b" exitCode=0 Dec 16 09:03:52 crc kubenswrapper[4789]: I1216 09:03:52.820610 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" 
event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"51fc0824d67e7b6f11ebf37c8b906bff4c8102cb814347e2bde3b2da83a60a9b"} Dec 16 09:03:52 crc kubenswrapper[4789]: I1216 09:03:52.821234 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359"} Dec 16 09:03:52 crc kubenswrapper[4789]: I1216 09:03:52.821256 4789 scope.go:117] "RemoveContainer" containerID="1bf4b3312e7240b30281c8b10e280d6394b38af76ca634c9ff660dd7a891990e" Dec 16 09:03:55 crc kubenswrapper[4789]: I1216 09:03:55.852095 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszhk" event={"ID":"5cca32d7-6960-4a5b-9ee6-28a898dcddb0","Type":"ContainerStarted","Data":"b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335"} Dec 16 09:03:57 crc kubenswrapper[4789]: I1216 09:03:57.885558 4789 generic.go:334] "Generic (PLEG): container finished" podID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerID="b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335" exitCode=0 Dec 16 09:03:57 crc kubenswrapper[4789]: I1216 09:03:57.885656 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszhk" event={"ID":"5cca32d7-6960-4a5b-9ee6-28a898dcddb0","Type":"ContainerDied","Data":"b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335"} Dec 16 09:03:58 crc kubenswrapper[4789]: I1216 09:03:58.896868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszhk" event={"ID":"5cca32d7-6960-4a5b-9ee6-28a898dcddb0","Type":"ContainerStarted","Data":"92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca"} Dec 16 09:03:58 crc kubenswrapper[4789]: I1216 09:03:58.916508 4789 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-nszhk" podStartSLOduration=2.253097248 podStartE2EDuration="8.916488657s" podCreationTimestamp="2025-12-16 09:03:50 +0000 UTC" firstStartedPulling="2025-12-16 09:03:51.811481192 +0000 UTC m=+7970.073368821" lastFinishedPulling="2025-12-16 09:03:58.474872601 +0000 UTC m=+7976.736760230" observedRunningTime="2025-12-16 09:03:58.913146856 +0000 UTC m=+7977.175034485" watchObservedRunningTime="2025-12-16 09:03:58.916488657 +0000 UTC m=+7977.178376286" Dec 16 09:04:00 crc kubenswrapper[4789]: I1216 09:04:00.369124 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:04:00 crc kubenswrapper[4789]: I1216 09:04:00.369439 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:04:00 crc kubenswrapper[4789]: I1216 09:04:00.418349 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:04:10 crc kubenswrapper[4789]: I1216 09:04:10.419789 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:04:10 crc kubenswrapper[4789]: I1216 09:04:10.466887 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nszhk"] Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.002294 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nszhk" podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="registry-server" containerID="cri-o://92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca" gracePeriod=2 Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.479075 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.532905 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-utilities\") pod \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.533706 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-utilities" (OuterVolumeSpecName: "utilities") pod "5cca32d7-6960-4a5b-9ee6-28a898dcddb0" (UID: "5cca32d7-6960-4a5b-9ee6-28a898dcddb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.534029 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9qtb\" (UniqueName: \"kubernetes.io/projected/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-kube-api-access-j9qtb\") pod \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.534092 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-catalog-content\") pod \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\" (UID: \"5cca32d7-6960-4a5b-9ee6-28a898dcddb0\") " Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.535191 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.539556 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-kube-api-access-j9qtb" (OuterVolumeSpecName: "kube-api-access-j9qtb") pod "5cca32d7-6960-4a5b-9ee6-28a898dcddb0" (UID: "5cca32d7-6960-4a5b-9ee6-28a898dcddb0"). InnerVolumeSpecName "kube-api-access-j9qtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.582295 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cca32d7-6960-4a5b-9ee6-28a898dcddb0" (UID: "5cca32d7-6960-4a5b-9ee6-28a898dcddb0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.637380 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9qtb\" (UniqueName: \"kubernetes.io/projected/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-kube-api-access-j9qtb\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:11 crc kubenswrapper[4789]: I1216 09:04:11.637776 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cca32d7-6960-4a5b-9ee6-28a898dcddb0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.027081 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nszhk" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.045184 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszhk" event={"ID":"5cca32d7-6960-4a5b-9ee6-28a898dcddb0","Type":"ContainerDied","Data":"92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca"} Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.045312 4789 generic.go:334] "Generic (PLEG): container finished" podID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerID="92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca" exitCode=0 Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.045374 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszhk" event={"ID":"5cca32d7-6960-4a5b-9ee6-28a898dcddb0","Type":"ContainerDied","Data":"ee5cb850e9bc2abb647cc0beed8ae8c62e5bf52b70abf2ee134aa4e6a3229ecf"} Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.045415 4789 scope.go:117] "RemoveContainer" containerID="92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.075092 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nszhk"] Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.082659 4789 scope.go:117] "RemoveContainer" containerID="b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.085853 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nszhk"] Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.103731 4789 scope.go:117] "RemoveContainer" containerID="044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.117119 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" path="/var/lib/kubelet/pods/5cca32d7-6960-4a5b-9ee6-28a898dcddb0/volumes" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.150743 4789 scope.go:117] "RemoveContainer" containerID="92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca" Dec 16 09:04:12 crc kubenswrapper[4789]: E1216 09:04:12.153614 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca\": container with ID starting with 92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca not found: ID does not exist" containerID="92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.154577 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca"} err="failed to get container status \"92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca\": rpc error: code = NotFound desc = could not find container \"92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca\": container with ID starting with 92394a61756efd42012416f6d6e7755802b074529d6cedf0fc8488a3d789adca not found: ID does not exist" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.154608 4789 scope.go:117] "RemoveContainer" containerID="b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335" Dec 16 09:04:12 crc kubenswrapper[4789]: E1216 09:04:12.155205 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335\": container with ID starting with b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335 not found: ID does not exist" 
containerID="b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.155240 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335"} err="failed to get container status \"b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335\": rpc error: code = NotFound desc = could not find container \"b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335\": container with ID starting with b9357d0314f9db9cc6d16230351515031b5507deb3fff645bc7b3823cba51335 not found: ID does not exist" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.155258 4789 scope.go:117] "RemoveContainer" containerID="044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19" Dec 16 09:04:12 crc kubenswrapper[4789]: E1216 09:04:12.155511 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19\": container with ID starting with 044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19 not found: ID does not exist" containerID="044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19" Dec 16 09:04:12 crc kubenswrapper[4789]: I1216 09:04:12.155537 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19"} err="failed to get container status \"044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19\": rpc error: code = NotFound desc = could not find container \"044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19\": container with ID starting with 044fe912eebd52764a5a325e5323cc5575ac7bb3f50405547ed5f5acd6f73a19 not found: ID does not exist" Dec 16 09:04:35 crc kubenswrapper[4789]: I1216 09:04:35.273367 4789 generic.go:334] 
"Generic (PLEG): container finished" podID="ea52433e-1eda-40ec-8bb9-32652828eeec" containerID="dafaa112ba14fd1fe367c6ceb4210e128d6e1f398ccbb737f54e92627b06c27c" exitCode=0 Dec 16 09:04:35 crc kubenswrapper[4789]: I1216 09:04:35.273446 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb" event={"ID":"ea52433e-1eda-40ec-8bb9-32652828eeec","Type":"ContainerDied","Data":"dafaa112ba14fd1fe367c6ceb4210e128d6e1f398ccbb737f54e92627b06c27c"} Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.732632 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756069 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-0\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756232 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ssh-key\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756268 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-2\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-telemetry-combined-ca-bundle\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756338 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-1\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756385 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-inventory\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756412 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7825c\" (UniqueName: \"kubernetes.io/projected/ea52433e-1eda-40ec-8bb9-32652828eeec-kube-api-access-7825c\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.756454 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceph\") pod \"ea52433e-1eda-40ec-8bb9-32652828eeec\" (UID: \"ea52433e-1eda-40ec-8bb9-32652828eeec\") " Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.762090 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.762133 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea52433e-1eda-40ec-8bb9-32652828eeec-kube-api-access-7825c" (OuterVolumeSpecName: "kube-api-access-7825c") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). InnerVolumeSpecName "kube-api-access-7825c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.772979 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceph" (OuterVolumeSpecName: "ceph") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.785829 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.788073 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-inventory" (OuterVolumeSpecName: "inventory") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.789260 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.796082 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.805080 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ea52433e-1eda-40ec-8bb9-32652828eeec" (UID: "ea52433e-1eda-40ec-8bb9-32652828eeec"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.858576 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.858601 4789 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.858613 4789 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.858624 4789 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.858635 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.858644 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7825c\" (UniqueName: \"kubernetes.io/projected/ea52433e-1eda-40ec-8bb9-32652828eeec-kube-api-access-7825c\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:36 crc kubenswrapper[4789]: I1216 09:04:36.858659 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:36 
crc kubenswrapper[4789]: I1216 09:04:36.858668 4789 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ea52433e-1eda-40ec-8bb9-32652828eeec-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.293780 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb" event={"ID":"ea52433e-1eda-40ec-8bb9-32652828eeec","Type":"ContainerDied","Data":"d637b71d62ee85c103bfd4bfcadc4cd47e2df169d9e50d561859b85e21ab7a25"} Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.294117 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d637b71d62ee85c103bfd4bfcadc4cd47e2df169d9e50d561859b85e21ab7a25" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.293836 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-zkjlb" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.384467 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-l888x"] Dec 16 09:04:37 crc kubenswrapper[4789]: E1216 09:04:37.385042 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea52433e-1eda-40ec-8bb9-32652828eeec" containerName="telemetry-openstack-openstack-cell1" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.385069 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea52433e-1eda-40ec-8bb9-32652828eeec" containerName="telemetry-openstack-openstack-cell1" Dec 16 09:04:37 crc kubenswrapper[4789]: E1216 09:04:37.385110 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="extract-content" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.385118 4789 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="extract-content" Dec 16 09:04:37 crc kubenswrapper[4789]: E1216 09:04:37.385136 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="registry-server" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.385143 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="registry-server" Dec 16 09:04:37 crc kubenswrapper[4789]: E1216 09:04:37.385157 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="extract-utilities" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.385165 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="extract-utilities" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.385384 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea52433e-1eda-40ec-8bb9-32652828eeec" containerName="telemetry-openstack-openstack-cell1" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.385396 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cca32d7-6960-4a5b-9ee6-28a898dcddb0" containerName="registry-server" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.386321 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.390599 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.391401 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.391807 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.392388 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.397937 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-l888x"] Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.401477 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.576084 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.576242 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 
09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.576319 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.576419 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr7j4\" (UniqueName: \"kubernetes.io/projected/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-kube-api-access-zr7j4\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.576502 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.576544 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.678132 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.678266 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.678378 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.678484 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.678518 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr7j4\" (UniqueName: \"kubernetes.io/projected/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-kube-api-access-zr7j4\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.678562 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.682711 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.682739 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.682783 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.682872 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-inventory\") pod 
\"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.688778 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.697684 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr7j4\" (UniqueName: \"kubernetes.io/projected/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-kube-api-access-zr7j4\") pod \"neutron-sriov-openstack-openstack-cell1-l888x\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:37 crc kubenswrapper[4789]: I1216 09:04:37.712786 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:04:38 crc kubenswrapper[4789]: I1216 09:04:38.308926 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-l888x"] Dec 16 09:04:39 crc kubenswrapper[4789]: I1216 09:04:39.315240 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" event={"ID":"4aed6986-dbe5-45bd-84e6-a1e31c1a89be","Type":"ContainerStarted","Data":"b8dc164f4dba78808ec61124954648d0adb099f09c1ca89569e2cfc5696177e2"} Dec 16 09:04:39 crc kubenswrapper[4789]: I1216 09:04:39.316672 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" event={"ID":"4aed6986-dbe5-45bd-84e6-a1e31c1a89be","Type":"ContainerStarted","Data":"df57771882dd7e933a3289e01b874686374496ba8213e07e8bcf7cebe16d4722"} Dec 16 09:04:39 crc kubenswrapper[4789]: I1216 09:04:39.337644 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" podStartSLOduration=1.866604504 podStartE2EDuration="2.337624017s" podCreationTimestamp="2025-12-16 09:04:37 +0000 UTC" firstStartedPulling="2025-12-16 09:04:38.315552747 +0000 UTC m=+8016.577440366" lastFinishedPulling="2025-12-16 09:04:38.78657225 +0000 UTC m=+8017.048459879" observedRunningTime="2025-12-16 09:04:39.333736532 +0000 UTC m=+8017.595624161" watchObservedRunningTime="2025-12-16 09:04:39.337624017 +0000 UTC m=+8017.599511646" Dec 16 09:06:21 crc kubenswrapper[4789]: I1216 09:06:21.928448 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:06:21 crc kubenswrapper[4789]: I1216 09:06:21.929052 4789 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:06:51 crc kubenswrapper[4789]: I1216 09:06:51.927646 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:06:51 crc kubenswrapper[4789]: I1216 09:06:51.928240 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:07:00 crc kubenswrapper[4789]: I1216 09:07:00.691179 4789 generic.go:334] "Generic (PLEG): container finished" podID="4aed6986-dbe5-45bd-84e6-a1e31c1a89be" containerID="b8dc164f4dba78808ec61124954648d0adb099f09c1ca89569e2cfc5696177e2" exitCode=0 Dec 16 09:07:00 crc kubenswrapper[4789]: I1216 09:07:00.691246 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" event={"ID":"4aed6986-dbe5-45bd-84e6-a1e31c1a89be","Type":"ContainerDied","Data":"b8dc164f4dba78808ec61124954648d0adb099f09c1ca89569e2cfc5696177e2"} Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.186637 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.230819 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr7j4\" (UniqueName: \"kubernetes.io/projected/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-kube-api-access-zr7j4\") pod \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.230902 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-combined-ca-bundle\") pod \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.230970 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-agent-neutron-config-0\") pod \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.231457 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ceph\") pod \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.231494 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ssh-key\") pod \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.231623 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-inventory\") pod \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\" (UID: \"4aed6986-dbe5-45bd-84e6-a1e31c1a89be\") " Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.249792 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "4aed6986-dbe5-45bd-84e6-a1e31c1a89be" (UID: "4aed6986-dbe5-45bd-84e6-a1e31c1a89be"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.249863 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ceph" (OuterVolumeSpecName: "ceph") pod "4aed6986-dbe5-45bd-84e6-a1e31c1a89be" (UID: "4aed6986-dbe5-45bd-84e6-a1e31c1a89be"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.253109 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-kube-api-access-zr7j4" (OuterVolumeSpecName: "kube-api-access-zr7j4") pod "4aed6986-dbe5-45bd-84e6-a1e31c1a89be" (UID: "4aed6986-dbe5-45bd-84e6-a1e31c1a89be"). InnerVolumeSpecName "kube-api-access-zr7j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.263208 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-inventory" (OuterVolumeSpecName: "inventory") pod "4aed6986-dbe5-45bd-84e6-a1e31c1a89be" (UID: "4aed6986-dbe5-45bd-84e6-a1e31c1a89be"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.266239 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4aed6986-dbe5-45bd-84e6-a1e31c1a89be" (UID: "4aed6986-dbe5-45bd-84e6-a1e31c1a89be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.269832 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "4aed6986-dbe5-45bd-84e6-a1e31c1a89be" (UID: "4aed6986-dbe5-45bd-84e6-a1e31c1a89be"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.334439 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr7j4\" (UniqueName: \"kubernetes.io/projected/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-kube-api-access-zr7j4\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.334482 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.334498 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.334511 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ceph\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.334522 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.334531 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4aed6986-dbe5-45bd-84e6-a1e31c1a89be-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.712463 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" event={"ID":"4aed6986-dbe5-45bd-84e6-a1e31c1a89be","Type":"ContainerDied","Data":"df57771882dd7e933a3289e01b874686374496ba8213e07e8bcf7cebe16d4722"} Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.712509 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df57771882dd7e933a3289e01b874686374496ba8213e07e8bcf7cebe16d4722" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.712513 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-l888x" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.831021 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-h74df"] Dec 16 09:07:02 crc kubenswrapper[4789]: E1216 09:07:02.831854 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aed6986-dbe5-45bd-84e6-a1e31c1a89be" containerName="neutron-sriov-openstack-openstack-cell1" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.831869 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aed6986-dbe5-45bd-84e6-a1e31c1a89be" containerName="neutron-sriov-openstack-openstack-cell1" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.832150 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aed6986-dbe5-45bd-84e6-a1e31c1a89be" containerName="neutron-sriov-openstack-openstack-cell1" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.833021 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.837585 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.837838 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.837897 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.839152 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.839679 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.848395 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-h74df"] Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.946330 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.946380 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.946432 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.946446 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.946546 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5q52\" (UniqueName: \"kubernetes.io/projected/ffb74c16-ff91-4d62-a5ea-c381a02c0768-kube-api-access-h5q52\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:02 crc kubenswrapper[4789]: I1216 09:07:02.946566 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.048882 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h5q52\" (UniqueName: \"kubernetes.io/projected/ffb74c16-ff91-4d62-a5ea-c381a02c0768-kube-api-access-h5q52\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.049272 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.050107 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.050146 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.050261 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.050283 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.053857 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.054435 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.055665 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.058237 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: 
\"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.061422 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.072209 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5q52\" (UniqueName: \"kubernetes.io/projected/ffb74c16-ff91-4d62-a5ea-c381a02c0768-kube-api-access-h5q52\") pod \"neutron-dhcp-openstack-openstack-cell1-h74df\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.151784 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:07:03 crc kubenswrapper[4789]: I1216 09:07:03.730814 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-h74df"] Dec 16 09:07:04 crc kubenswrapper[4789]: I1216 09:07:04.779399 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" event={"ID":"ffb74c16-ff91-4d62-a5ea-c381a02c0768","Type":"ContainerStarted","Data":"55980e2e0e75a0f09878d29b3ee686706270fdeb91c770ae8832c5ae65ebe628"} Dec 16 09:07:04 crc kubenswrapper[4789]: I1216 09:07:04.780057 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" event={"ID":"ffb74c16-ff91-4d62-a5ea-c381a02c0768","Type":"ContainerStarted","Data":"53ffc4bed7f1fbffa71ac65fbca28952cd0222b38d99f90a036845479de8949b"} Dec 16 09:07:04 crc kubenswrapper[4789]: I1216 09:07:04.806963 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" podStartSLOduration=2.343602359 podStartE2EDuration="2.806938495s" podCreationTimestamp="2025-12-16 09:07:02 +0000 UTC" firstStartedPulling="2025-12-16 09:07:03.73960857 +0000 UTC m=+8162.001496199" lastFinishedPulling="2025-12-16 09:07:04.202944706 +0000 UTC m=+8162.464832335" observedRunningTime="2025-12-16 09:07:04.797686829 +0000 UTC m=+8163.059574478" watchObservedRunningTime="2025-12-16 09:07:04.806938495 +0000 UTC m=+8163.068826134" Dec 16 09:07:21 crc kubenswrapper[4789]: I1216 09:07:21.927684 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:07:21 crc kubenswrapper[4789]: I1216 09:07:21.928328 4789 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:07:21 crc kubenswrapper[4789]: I1216 09:07:21.928382 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 09:07:21 crc kubenswrapper[4789]: I1216 09:07:21.929259 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:07:21 crc kubenswrapper[4789]: I1216 09:07:21.929322 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" gracePeriod=600 Dec 16 09:07:22 crc kubenswrapper[4789]: E1216 09:07:22.661594 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:07:22 crc kubenswrapper[4789]: I1216 09:07:22.950191 4789 generic.go:334] "Generic (PLEG): container finished" 
podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" exitCode=0 Dec 16 09:07:22 crc kubenswrapper[4789]: I1216 09:07:22.950250 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359"} Dec 16 09:07:22 crc kubenswrapper[4789]: I1216 09:07:22.950283 4789 scope.go:117] "RemoveContainer" containerID="51fc0824d67e7b6f11ebf37c8b906bff4c8102cb814347e2bde3b2da83a60a9b" Dec 16 09:07:22 crc kubenswrapper[4789]: I1216 09:07:22.951332 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:07:22 crc kubenswrapper[4789]: E1216 09:07:22.952062 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:07:35 crc kubenswrapper[4789]: I1216 09:07:35.105130 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:07:35 crc kubenswrapper[4789]: E1216 09:07:35.105852 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 
09:07:48 crc kubenswrapper[4789]: I1216 09:07:48.106111 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:07:48 crc kubenswrapper[4789]: E1216 09:07:48.106894 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:08:01 crc kubenswrapper[4789]: I1216 09:08:01.104722 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:08:01 crc kubenswrapper[4789]: E1216 09:08:01.105655 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:08:14 crc kubenswrapper[4789]: I1216 09:08:14.105505 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:08:14 crc kubenswrapper[4789]: E1216 09:08:14.106420 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" 
podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.012959 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txsmb"] Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.016181 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.060672 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txsmb"] Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.208495 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-catalog-content\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.208553 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-utilities\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.209090 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cjfz\" (UniqueName: \"kubernetes.io/projected/e33daf95-8852-4a9d-a2fc-40d10a3e5303-kube-api-access-9cjfz\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.311786 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cjfz\" 
(UniqueName: \"kubernetes.io/projected/e33daf95-8852-4a9d-a2fc-40d10a3e5303-kube-api-access-9cjfz\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.312175 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-catalog-content\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.312202 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-utilities\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.312703 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-catalog-content\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.312841 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-utilities\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.345130 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cjfz\" (UniqueName: 
\"kubernetes.io/projected/e33daf95-8852-4a9d-a2fc-40d10a3e5303-kube-api-access-9cjfz\") pod \"redhat-marketplace-txsmb\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.363973 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:24 crc kubenswrapper[4789]: I1216 09:08:24.893027 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txsmb"] Dec 16 09:08:25 crc kubenswrapper[4789]: I1216 09:08:25.562292 4789 generic.go:334] "Generic (PLEG): container finished" podID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerID="8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560" exitCode=0 Dec 16 09:08:25 crc kubenswrapper[4789]: I1216 09:08:25.562359 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txsmb" event={"ID":"e33daf95-8852-4a9d-a2fc-40d10a3e5303","Type":"ContainerDied","Data":"8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560"} Dec 16 09:08:25 crc kubenswrapper[4789]: I1216 09:08:25.562581 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txsmb" event={"ID":"e33daf95-8852-4a9d-a2fc-40d10a3e5303","Type":"ContainerStarted","Data":"7f3c6a4a8b8d5b8689efb32b77ee833aa9a7f503ea3ed07c9a090f57920c4ef9"} Dec 16 09:08:27 crc kubenswrapper[4789]: I1216 09:08:27.580522 4789 generic.go:334] "Generic (PLEG): container finished" podID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerID="91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382" exitCode=0 Dec 16 09:08:27 crc kubenswrapper[4789]: I1216 09:08:27.580610 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txsmb" 
event={"ID":"e33daf95-8852-4a9d-a2fc-40d10a3e5303","Type":"ContainerDied","Data":"91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382"} Dec 16 09:08:28 crc kubenswrapper[4789]: I1216 09:08:28.105945 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:08:28 crc kubenswrapper[4789]: E1216 09:08:28.106451 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:08:28 crc kubenswrapper[4789]: I1216 09:08:28.591499 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txsmb" event={"ID":"e33daf95-8852-4a9d-a2fc-40d10a3e5303","Type":"ContainerStarted","Data":"9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06"} Dec 16 09:08:28 crc kubenswrapper[4789]: I1216 09:08:28.619629 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txsmb" podStartSLOduration=3.126839979 podStartE2EDuration="5.619608478s" podCreationTimestamp="2025-12-16 09:08:23 +0000 UTC" firstStartedPulling="2025-12-16 09:08:25.5642221 +0000 UTC m=+8243.826109729" lastFinishedPulling="2025-12-16 09:08:28.056990599 +0000 UTC m=+8246.318878228" observedRunningTime="2025-12-16 09:08:28.614234007 +0000 UTC m=+8246.876121636" watchObservedRunningTime="2025-12-16 09:08:28.619608478 +0000 UTC m=+8246.881496107" Dec 16 09:08:34 crc kubenswrapper[4789]: I1216 09:08:34.364876 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:34 crc 
kubenswrapper[4789]: I1216 09:08:34.365368 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:34 crc kubenswrapper[4789]: I1216 09:08:34.415794 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:34 crc kubenswrapper[4789]: I1216 09:08:34.690235 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:34 crc kubenswrapper[4789]: I1216 09:08:34.741195 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txsmb"] Dec 16 09:08:36 crc kubenswrapper[4789]: I1216 09:08:36.660977 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-txsmb" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="registry-server" containerID="cri-o://9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06" gracePeriod=2 Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.156364 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.275795 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-utilities\") pod \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.275887 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cjfz\" (UniqueName: \"kubernetes.io/projected/e33daf95-8852-4a9d-a2fc-40d10a3e5303-kube-api-access-9cjfz\") pod \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.276017 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-catalog-content\") pod \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\" (UID: \"e33daf95-8852-4a9d-a2fc-40d10a3e5303\") " Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.276838 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-utilities" (OuterVolumeSpecName: "utilities") pod "e33daf95-8852-4a9d-a2fc-40d10a3e5303" (UID: "e33daf95-8852-4a9d-a2fc-40d10a3e5303"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.282827 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e33daf95-8852-4a9d-a2fc-40d10a3e5303-kube-api-access-9cjfz" (OuterVolumeSpecName: "kube-api-access-9cjfz") pod "e33daf95-8852-4a9d-a2fc-40d10a3e5303" (UID: "e33daf95-8852-4a9d-a2fc-40d10a3e5303"). InnerVolumeSpecName "kube-api-access-9cjfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.296172 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e33daf95-8852-4a9d-a2fc-40d10a3e5303" (UID: "e33daf95-8852-4a9d-a2fc-40d10a3e5303"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.379227 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.379301 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cjfz\" (UniqueName: \"kubernetes.io/projected/e33daf95-8852-4a9d-a2fc-40d10a3e5303-kube-api-access-9cjfz\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.379316 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e33daf95-8852-4a9d-a2fc-40d10a3e5303-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.676283 4789 generic.go:334] "Generic (PLEG): container finished" podID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerID="9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06" exitCode=0 Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.676359 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txsmb" event={"ID":"e33daf95-8852-4a9d-a2fc-40d10a3e5303","Type":"ContainerDied","Data":"9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06"} Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.676421 4789 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-txsmb" event={"ID":"e33daf95-8852-4a9d-a2fc-40d10a3e5303","Type":"ContainerDied","Data":"7f3c6a4a8b8d5b8689efb32b77ee833aa9a7f503ea3ed07c9a090f57920c4ef9"} Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.676449 4789 scope.go:117] "RemoveContainer" containerID="9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.676360 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txsmb" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.715208 4789 scope.go:117] "RemoveContainer" containerID="91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.723722 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txsmb"] Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.737265 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-txsmb"] Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.741330 4789 scope.go:117] "RemoveContainer" containerID="8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.790135 4789 scope.go:117] "RemoveContainer" containerID="9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06" Dec 16 09:08:37 crc kubenswrapper[4789]: E1216 09:08:37.790633 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06\": container with ID starting with 9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06 not found: ID does not exist" containerID="9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.790683 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06"} err="failed to get container status \"9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06\": rpc error: code = NotFound desc = could not find container \"9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06\": container with ID starting with 9e19b5e8d305a5b55e931227fdc23dc91ca3e8cb6e688faffc16a91a69175d06 not found: ID does not exist" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.790721 4789 scope.go:117] "RemoveContainer" containerID="91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382" Dec 16 09:08:37 crc kubenswrapper[4789]: E1216 09:08:37.791218 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382\": container with ID starting with 91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382 not found: ID does not exist" containerID="91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.791365 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382"} err="failed to get container status \"91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382\": rpc error: code = NotFound desc = could not find container \"91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382\": container with ID starting with 91b770a7d0eb88e6ae38c78abed861eba6e4268b3e4cd5a40157d016fe0f9382 not found: ID does not exist" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.791467 4789 scope.go:117] "RemoveContainer" containerID="8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560" Dec 16 09:08:37 crc kubenswrapper[4789]: E1216 
09:08:37.791950 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560\": container with ID starting with 8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560 not found: ID does not exist" containerID="8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560" Dec 16 09:08:37 crc kubenswrapper[4789]: I1216 09:08:37.792027 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560"} err="failed to get container status \"8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560\": rpc error: code = NotFound desc = could not find container \"8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560\": container with ID starting with 8e3002bce0d5ac93b490c90f9f290d4f12508cf07ad8787dcfba3cab99d17560 not found: ID does not exist" Dec 16 09:08:38 crc kubenswrapper[4789]: I1216 09:08:38.116465 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" path="/var/lib/kubelet/pods/e33daf95-8852-4a9d-a2fc-40d10a3e5303/volumes" Dec 16 09:08:40 crc kubenswrapper[4789]: I1216 09:08:40.105314 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:08:40 crc kubenswrapper[4789]: E1216 09:08:40.105904 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:08:55 crc kubenswrapper[4789]: I1216 09:08:55.104677 
4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:08:55 crc kubenswrapper[4789]: E1216 09:08:55.105471 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:09:03 crc kubenswrapper[4789]: I1216 09:09:03.908436 4789 generic.go:334] "Generic (PLEG): container finished" podID="ffb74c16-ff91-4d62-a5ea-c381a02c0768" containerID="55980e2e0e75a0f09878d29b3ee686706270fdeb91c770ae8832c5ae65ebe628" exitCode=0 Dec 16 09:09:03 crc kubenswrapper[4789]: I1216 09:09:03.908524 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" event={"ID":"ffb74c16-ff91-4d62-a5ea-c381a02c0768","Type":"ContainerDied","Data":"55980e2e0e75a0f09878d29b3ee686706270fdeb91c770ae8832c5ae65ebe628"} Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.348883 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.540473 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-inventory\") pod \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.540599 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ssh-key\") pod \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.540633 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-combined-ca-bundle\") pod \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.540679 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ceph\") pod \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.540738 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-agent-neutron-config-0\") pod \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.540819 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-h5q52\" (UniqueName: \"kubernetes.io/projected/ffb74c16-ff91-4d62-a5ea-c381a02c0768-kube-api-access-h5q52\") pod \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\" (UID: \"ffb74c16-ff91-4d62-a5ea-c381a02c0768\") " Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.546419 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb74c16-ff91-4d62-a5ea-c381a02c0768-kube-api-access-h5q52" (OuterVolumeSpecName: "kube-api-access-h5q52") pod "ffb74c16-ff91-4d62-a5ea-c381a02c0768" (UID: "ffb74c16-ff91-4d62-a5ea-c381a02c0768"). InnerVolumeSpecName "kube-api-access-h5q52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.547611 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ceph" (OuterVolumeSpecName: "ceph") pod "ffb74c16-ff91-4d62-a5ea-c381a02c0768" (UID: "ffb74c16-ff91-4d62-a5ea-c381a02c0768"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.551821 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "ffb74c16-ff91-4d62-a5ea-c381a02c0768" (UID: "ffb74c16-ff91-4d62-a5ea-c381a02c0768"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.572347 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffb74c16-ff91-4d62-a5ea-c381a02c0768" (UID: "ffb74c16-ff91-4d62-a5ea-c381a02c0768"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.572769 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "ffb74c16-ff91-4d62-a5ea-c381a02c0768" (UID: "ffb74c16-ff91-4d62-a5ea-c381a02c0768"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.579964 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-inventory" (OuterVolumeSpecName: "inventory") pod "ffb74c16-ff91-4d62-a5ea-c381a02c0768" (UID: "ffb74c16-ff91-4d62-a5ea-c381a02c0768"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.642868 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.643245 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.643256 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.643267 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-ceph\") on node 
\"crc\" DevicePath \"\"" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.643278 4789 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ffb74c16-ff91-4d62-a5ea-c381a02c0768-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.643291 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5q52\" (UniqueName: \"kubernetes.io/projected/ffb74c16-ff91-4d62-a5ea-c381a02c0768-kube-api-access-h5q52\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.928585 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" event={"ID":"ffb74c16-ff91-4d62-a5ea-c381a02c0768","Type":"ContainerDied","Data":"53ffc4bed7f1fbffa71ac65fbca28952cd0222b38d99f90a036845479de8949b"} Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.928645 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53ffc4bed7f1fbffa71ac65fbca28952cd0222b38d99f90a036845479de8949b" Dec 16 09:09:05 crc kubenswrapper[4789]: I1216 09:09:05.928688 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-h74df" Dec 16 09:09:06 crc kubenswrapper[4789]: I1216 09:09:06.106091 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:09:06 crc kubenswrapper[4789]: E1216 09:09:06.106350 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:09:19 crc kubenswrapper[4789]: I1216 09:09:19.105540 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:09:19 crc kubenswrapper[4789]: E1216 09:09:19.106354 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:09:34 crc kubenswrapper[4789]: I1216 09:09:34.105570 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:09:34 crc kubenswrapper[4789]: E1216 09:09:34.106790 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.282531 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.282742 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="70802626-e689-4f46-b3e4-5c2b74cec5bb" containerName="nova-cell0-conductor-conductor" containerID="cri-o://bee44f846086f1f8459648f0221132f8bd82d20a15e603a91dd34127e94c38f4" gracePeriod=30 Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.316952 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.317189 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e68d94f6-4320-41cf-a86c-6ad140e2773a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44" gracePeriod=30 Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.928880 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.929662 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a38e49bf-c8bd-4581-81d8-04c735d9e281" containerName="nova-scheduler-scheduler" containerID="cri-o://2b85906693eee84dbe4f22bc3650497471e1fcfa4651ef649bf3a8b85b3831a4" gracePeriod=30 Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.940603 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.941037 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-api" containerID="cri-o://86c3b03baaccb18ae314a191d2fa45e7d1371b9bc7babc5165b6d3913214685c" gracePeriod=30 Dec 16 09:09:35 crc kubenswrapper[4789]: I1216 09:09:35.941301 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-log" containerID="cri-o://a4a1670d1a87a6d2ce3aaf61446e37348c93feb11559768fe701c88f1758b7f6" gracePeriod=30 Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.028291 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.028567 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-log" containerID="cri-o://4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c" gracePeriod=30 Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.029031 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-metadata" containerID="cri-o://1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277" gracePeriod=30 Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.218028 4789 generic.go:334] "Generic (PLEG): container finished" podID="83d2253f-62e3-4a40-a1ff-66802515e914" containerID="4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c" exitCode=143 Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.218101 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83d2253f-62e3-4a40-a1ff-66802515e914","Type":"ContainerDied","Data":"4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c"} Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.220361 
4789 generic.go:334] "Generic (PLEG): container finished" podID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerID="a4a1670d1a87a6d2ce3aaf61446e37348c93feb11559768fe701c88f1758b7f6" exitCode=143 Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.220411 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6465ba6-5f85-4b46-baea-61349bea2e86","Type":"ContainerDied","Data":"a4a1670d1a87a6d2ce3aaf61446e37348c93feb11559768fe701c88f1758b7f6"} Dec 16 09:09:36 crc kubenswrapper[4789]: I1216 09:09:36.934757 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.055045 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-combined-ca-bundle\") pod \"e68d94f6-4320-41cf-a86c-6ad140e2773a\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.055666 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-config-data\") pod \"e68d94f6-4320-41cf-a86c-6ad140e2773a\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.055828 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8bvp\" (UniqueName: \"kubernetes.io/projected/e68d94f6-4320-41cf-a86c-6ad140e2773a-kube-api-access-p8bvp\") pod \"e68d94f6-4320-41cf-a86c-6ad140e2773a\" (UID: \"e68d94f6-4320-41cf-a86c-6ad140e2773a\") " Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.068283 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e68d94f6-4320-41cf-a86c-6ad140e2773a-kube-api-access-p8bvp" (OuterVolumeSpecName: 
"kube-api-access-p8bvp") pod "e68d94f6-4320-41cf-a86c-6ad140e2773a" (UID: "e68d94f6-4320-41cf-a86c-6ad140e2773a"). InnerVolumeSpecName "kube-api-access-p8bvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.100703 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-config-data" (OuterVolumeSpecName: "config-data") pod "e68d94f6-4320-41cf-a86c-6ad140e2773a" (UID: "e68d94f6-4320-41cf-a86c-6ad140e2773a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.115310 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e68d94f6-4320-41cf-a86c-6ad140e2773a" (UID: "e68d94f6-4320-41cf-a86c-6ad140e2773a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.157789 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.157828 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8bvp\" (UniqueName: \"kubernetes.io/projected/e68d94f6-4320-41cf-a86c-6ad140e2773a-kube-api-access-p8bvp\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.157841 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68d94f6-4320-41cf-a86c-6ad140e2773a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.236623 4789 generic.go:334] "Generic (PLEG): container finished" podID="e68d94f6-4320-41cf-a86c-6ad140e2773a" containerID="d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44" exitCode=0 Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.236678 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e68d94f6-4320-41cf-a86c-6ad140e2773a","Type":"ContainerDied","Data":"d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44"} Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.236716 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e68d94f6-4320-41cf-a86c-6ad140e2773a","Type":"ContainerDied","Data":"a80c342571c17d50a7506be7b38903b1a1b318c5020a6d22d9529894994fd7a0"} Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.236740 4789 scope.go:117] "RemoveContainer" containerID="d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.238626 4789 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.268434 4789 scope.go:117] "RemoveContainer" containerID="d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44" Dec 16 09:09:37 crc kubenswrapper[4789]: E1216 09:09:37.274053 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44\": container with ID starting with d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44 not found: ID does not exist" containerID="d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.274098 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44"} err="failed to get container status \"d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44\": rpc error: code = NotFound desc = could not find container \"d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44\": container with ID starting with d29f2192f446a7c79da0a020ef8d8b7721c7da55fb12379fa196fae889759f44 not found: ID does not exist" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.283120 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.294025 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.314040 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:09:37 crc kubenswrapper[4789]: E1216 09:09:37.314687 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68d94f6-4320-41cf-a86c-6ad140e2773a" 
containerName="nova-cell1-conductor-conductor" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.314709 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68d94f6-4320-41cf-a86c-6ad140e2773a" containerName="nova-cell1-conductor-conductor" Dec 16 09:09:37 crc kubenswrapper[4789]: E1216 09:09:37.314722 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb74c16-ff91-4d62-a5ea-c381a02c0768" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.314730 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb74c16-ff91-4d62-a5ea-c381a02c0768" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 16 09:09:37 crc kubenswrapper[4789]: E1216 09:09:37.314753 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="extract-content" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.314761 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="extract-content" Dec 16 09:09:37 crc kubenswrapper[4789]: E1216 09:09:37.314781 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="extract-utilities" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.314787 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="extract-utilities" Dec 16 09:09:37 crc kubenswrapper[4789]: E1216 09:09:37.314802 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="registry-server" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.314808 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="registry-server" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.315050 4789 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ffb74c16-ff91-4d62-a5ea-c381a02c0768" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.315075 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e68d94f6-4320-41cf-a86c-6ad140e2773a" containerName="nova-cell1-conductor-conductor" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.315090 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="e33daf95-8852-4a9d-a2fc-40d10a3e5303" containerName="registry-server" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.316578 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.319130 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.338813 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.467650 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w2k7\" (UniqueName: \"kubernetes.io/projected/a7493e41-dae1-4d90-a734-fda98dd32937-kube-api-access-5w2k7\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.468091 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7493e41-dae1-4d90-a734-fda98dd32937-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.468189 4789 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7493e41-dae1-4d90-a734-fda98dd32937-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.571345 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w2k7\" (UniqueName: \"kubernetes.io/projected/a7493e41-dae1-4d90-a734-fda98dd32937-kube-api-access-5w2k7\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.571560 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7493e41-dae1-4d90-a734-fda98dd32937-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.571602 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7493e41-dae1-4d90-a734-fda98dd32937-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.576464 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7493e41-dae1-4d90-a734-fda98dd32937-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.584348 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a7493e41-dae1-4d90-a734-fda98dd32937-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.587327 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w2k7\" (UniqueName: \"kubernetes.io/projected/a7493e41-dae1-4d90-a734-fda98dd32937-kube-api-access-5w2k7\") pod \"nova-cell1-conductor-0\" (UID: \"a7493e41-dae1-4d90-a734-fda98dd32937\") " pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:37 crc kubenswrapper[4789]: I1216 09:09:37.634872 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:38 crc kubenswrapper[4789]: I1216 09:09:38.121979 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e68d94f6-4320-41cf-a86c-6ad140e2773a" path="/var/lib/kubelet/pods/e68d94f6-4320-41cf-a86c-6ad140e2773a/volumes" Dec 16 09:09:38 crc kubenswrapper[4789]: I1216 09:09:38.142655 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 09:09:38 crc kubenswrapper[4789]: I1216 09:09:38.248067 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a7493e41-dae1-4d90-a734-fda98dd32937","Type":"ContainerStarted","Data":"57f6c74ff856f83640e3421d787e4493612eadc5d33571fd2e1104dcf83614b3"} Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.072028 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": read tcp 10.217.0.2:38268->10.217.1.81:8774: read: connection reset by peer" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.072039 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" 
podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": read tcp 10.217.0.2:38264->10.217.1.81:8774: read: connection reset by peer" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.229158 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": read tcp 10.217.0.2:60156->10.217.1.82:8775: read: connection reset by peer" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.229172 4789 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": read tcp 10.217.0.2:60142->10.217.1.82:8775: read: connection reset by peer" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.260754 4789 generic.go:334] "Generic (PLEG): container finished" podID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerID="86c3b03baaccb18ae314a191d2fa45e7d1371b9bc7babc5165b6d3913214685c" exitCode=0 Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.260824 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6465ba6-5f85-4b46-baea-61349bea2e86","Type":"ContainerDied","Data":"86c3b03baaccb18ae314a191d2fa45e7d1371b9bc7babc5165b6d3913214685c"} Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.265163 4789 generic.go:334] "Generic (PLEG): container finished" podID="a38e49bf-c8bd-4581-81d8-04c735d9e281" containerID="2b85906693eee84dbe4f22bc3650497471e1fcfa4651ef649bf3a8b85b3831a4" exitCode=0 Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.265225 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"a38e49bf-c8bd-4581-81d8-04c735d9e281","Type":"ContainerDied","Data":"2b85906693eee84dbe4f22bc3650497471e1fcfa4651ef649bf3a8b85b3831a4"} Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.266496 4789 generic.go:334] "Generic (PLEG): container finished" podID="70802626-e689-4f46-b3e4-5c2b74cec5bb" containerID="bee44f846086f1f8459648f0221132f8bd82d20a15e603a91dd34127e94c38f4" exitCode=0 Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.266539 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"70802626-e689-4f46-b3e4-5c2b74cec5bb","Type":"ContainerDied","Data":"bee44f846086f1f8459648f0221132f8bd82d20a15e603a91dd34127e94c38f4"} Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.267704 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a7493e41-dae1-4d90-a734-fda98dd32937","Type":"ContainerStarted","Data":"d70a08c8c3ed559e153d4c9aefc8c800bb454f0f867ee631a8c45c6fbdd8c2d4"} Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.268968 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.291879 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.291862914 podStartE2EDuration="2.291862914s" podCreationTimestamp="2025-12-16 09:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:09:39.288645296 +0000 UTC m=+8317.550532915" watchObservedRunningTime="2025-12-16 09:09:39.291862914 +0000 UTC m=+8317.553750543" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.592988 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.702072 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.721305 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fq67\" (UniqueName: \"kubernetes.io/projected/a38e49bf-c8bd-4581-81d8-04c735d9e281-kube-api-access-2fq67\") pod \"a38e49bf-c8bd-4581-81d8-04c735d9e281\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.721613 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-combined-ca-bundle\") pod \"a38e49bf-c8bd-4581-81d8-04c735d9e281\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.721733 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-config-data\") pod \"a38e49bf-c8bd-4581-81d8-04c735d9e281\" (UID: \"a38e49bf-c8bd-4581-81d8-04c735d9e281\") " Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.732433 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a38e49bf-c8bd-4581-81d8-04c735d9e281-kube-api-access-2fq67" (OuterVolumeSpecName: "kube-api-access-2fq67") pod "a38e49bf-c8bd-4581-81d8-04c735d9e281" (UID: "a38e49bf-c8bd-4581-81d8-04c735d9e281"). InnerVolumeSpecName "kube-api-access-2fq67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.757690 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a38e49bf-c8bd-4581-81d8-04c735d9e281" (UID: "a38e49bf-c8bd-4581-81d8-04c735d9e281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.786656 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-config-data" (OuterVolumeSpecName: "config-data") pod "a38e49bf-c8bd-4581-81d8-04c735d9e281" (UID: "a38e49bf-c8bd-4581-81d8-04c735d9e281"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.827701 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-config-data\") pod \"70802626-e689-4f46-b3e4-5c2b74cec5bb\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.827960 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-combined-ca-bundle\") pod \"70802626-e689-4f46-b3e4-5c2b74cec5bb\" (UID: \"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.828043 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fgdq\" (UniqueName: \"kubernetes.io/projected/70802626-e689-4f46-b3e4-5c2b74cec5bb-kube-api-access-2fgdq\") pod \"70802626-e689-4f46-b3e4-5c2b74cec5bb\" (UID: 
\"70802626-e689-4f46-b3e4-5c2b74cec5bb\") " Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.828625 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fq67\" (UniqueName: \"kubernetes.io/projected/a38e49bf-c8bd-4581-81d8-04c735d9e281-kube-api-access-2fq67\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.828650 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.828663 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a38e49bf-c8bd-4581-81d8-04c735d9e281-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.837941 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70802626-e689-4f46-b3e4-5c2b74cec5bb-kube-api-access-2fgdq" (OuterVolumeSpecName: "kube-api-access-2fgdq") pod "70802626-e689-4f46-b3e4-5c2b74cec5bb" (UID: "70802626-e689-4f46-b3e4-5c2b74cec5bb"). InnerVolumeSpecName "kube-api-access-2fgdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.849588 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.860108 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70802626-e689-4f46-b3e4-5c2b74cec5bb" (UID: "70802626-e689-4f46-b3e4-5c2b74cec5bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.880687 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-config-data" (OuterVolumeSpecName: "config-data") pod "70802626-e689-4f46-b3e4-5c2b74cec5bb" (UID: "70802626-e689-4f46-b3e4-5c2b74cec5bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.931166 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.931208 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fgdq\" (UniqueName: \"kubernetes.io/projected/70802626-e689-4f46-b3e4-5c2b74cec5bb-kube-api-access-2fgdq\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:39 crc kubenswrapper[4789]: I1216 09:09:39.931221 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70802626-e689-4f46-b3e4-5c2b74cec5bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.032662 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6465ba6-5f85-4b46-baea-61349bea2e86-logs\") pod \"c6465ba6-5f85-4b46-baea-61349bea2e86\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.032742 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-config-data\") pod \"c6465ba6-5f85-4b46-baea-61349bea2e86\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " Dec 16 09:09:40 crc 
kubenswrapper[4789]: I1216 09:09:40.032851 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-combined-ca-bundle\") pod \"c6465ba6-5f85-4b46-baea-61349bea2e86\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.032937 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jwr4\" (UniqueName: \"kubernetes.io/projected/c6465ba6-5f85-4b46-baea-61349bea2e86-kube-api-access-4jwr4\") pod \"c6465ba6-5f85-4b46-baea-61349bea2e86\" (UID: \"c6465ba6-5f85-4b46-baea-61349bea2e86\") " Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.034005 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6465ba6-5f85-4b46-baea-61349bea2e86-logs" (OuterVolumeSpecName: "logs") pod "c6465ba6-5f85-4b46-baea-61349bea2e86" (UID: "c6465ba6-5f85-4b46-baea-61349bea2e86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.041284 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6465ba6-5f85-4b46-baea-61349bea2e86-kube-api-access-4jwr4" (OuterVolumeSpecName: "kube-api-access-4jwr4") pod "c6465ba6-5f85-4b46-baea-61349bea2e86" (UID: "c6465ba6-5f85-4b46-baea-61349bea2e86"). InnerVolumeSpecName "kube-api-access-4jwr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.090128 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-config-data" (OuterVolumeSpecName: "config-data") pod "c6465ba6-5f85-4b46-baea-61349bea2e86" (UID: "c6465ba6-5f85-4b46-baea-61349bea2e86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.106447 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6465ba6-5f85-4b46-baea-61349bea2e86" (UID: "c6465ba6-5f85-4b46-baea-61349bea2e86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.143609 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6465ba6-5f85-4b46-baea-61349bea2e86-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.143654 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.143667 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6465ba6-5f85-4b46-baea-61349bea2e86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.144507 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jwr4\" (UniqueName: \"kubernetes.io/projected/c6465ba6-5f85-4b46-baea-61349bea2e86-kube-api-access-4jwr4\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.179722 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w"] Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.180177 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70802626-e689-4f46-b3e4-5c2b74cec5bb" containerName="nova-cell0-conductor-conductor" Dec 16 09:09:40 crc 
kubenswrapper[4789]: I1216 09:09:40.180203 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="70802626-e689-4f46-b3e4-5c2b74cec5bb" containerName="nova-cell0-conductor-conductor" Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.180215 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a38e49bf-c8bd-4581-81d8-04c735d9e281" containerName="nova-scheduler-scheduler" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.180223 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a38e49bf-c8bd-4581-81d8-04c735d9e281" containerName="nova-scheduler-scheduler" Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.180252 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-log" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.180258 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-log" Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.180270 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-api" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.180276 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-api" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.180470 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-api" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.180487 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a38e49bf-c8bd-4581-81d8-04c735d9e281" containerName="nova-scheduler-scheduler" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.180499 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" containerName="nova-api-log" Dec 
16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.180518 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="70802626-e689-4f46-b3e4-5c2b74cec5bb" containerName="nova-cell0-conductor-conductor" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.181328 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.185585 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-2nclq" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.185638 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.185791 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.186157 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.186524 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.187362 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.187780 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.209739 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.229879 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.245368 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d2253f-62e3-4a40-a1ff-66802515e914-logs\") pod \"83d2253f-62e3-4a40-a1ff-66802515e914\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.245538 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xtk4\" (UniqueName: \"kubernetes.io/projected/83d2253f-62e3-4a40-a1ff-66802515e914-kube-api-access-7xtk4\") pod \"83d2253f-62e3-4a40-a1ff-66802515e914\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.245640 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-config-data\") pod \"83d2253f-62e3-4a40-a1ff-66802515e914\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.245712 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-combined-ca-bundle\") pod \"83d2253f-62e3-4a40-a1ff-66802515e914\" (UID: \"83d2253f-62e3-4a40-a1ff-66802515e914\") " Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.245987 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83d2253f-62e3-4a40-a1ff-66802515e914-logs" (OuterVolumeSpecName: "logs") pod "83d2253f-62e3-4a40-a1ff-66802515e914" (UID: "83d2253f-62e3-4a40-a1ff-66802515e914"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.246084 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.246702 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4zt6\" (UniqueName: \"kubernetes.io/projected/12ce9e20-a637-474c-862b-a8c47381fda9-kube-api-access-m4zt6\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.246798 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.246899 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247228 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247317 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247352 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247402 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247482 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247614 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247756 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.247836 4789 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d2253f-62e3-4a40-a1ff-66802515e914-logs\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.278339 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d2253f-62e3-4a40-a1ff-66802515e914-kube-api-access-7xtk4" 
(OuterVolumeSpecName: "kube-api-access-7xtk4") pod "83d2253f-62e3-4a40-a1ff-66802515e914" (UID: "83d2253f-62e3-4a40-a1ff-66802515e914"). InnerVolumeSpecName "kube-api-access-7xtk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.309222 4789 generic.go:334] "Generic (PLEG): container finished" podID="83d2253f-62e3-4a40-a1ff-66802515e914" containerID="1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277" exitCode=0 Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.309552 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83d2253f-62e3-4a40-a1ff-66802515e914","Type":"ContainerDied","Data":"1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277"} Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.309646 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83d2253f-62e3-4a40-a1ff-66802515e914","Type":"ContainerDied","Data":"0ec106db53e5ece5d29ef80eef15879e737dcb015db2784c9e67cada7a677743"} Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.309730 4789 scope.go:117] "RemoveContainer" containerID="1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.310759 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.340345 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6465ba6-5f85-4b46-baea-61349bea2e86","Type":"ContainerDied","Data":"90fa201ace39bf0e8255f479d8bb34a1671bb37eab994a10646dac961568e1b7"} Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.340529 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.351782 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.351848 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.351973 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352114 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352193 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352283 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4zt6\" (UniqueName: \"kubernetes.io/projected/12ce9e20-a637-474c-862b-a8c47381fda9-kube-api-access-m4zt6\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352353 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352417 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352526 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352615 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352652 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.352750 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xtk4\" (UniqueName: \"kubernetes.io/projected/83d2253f-62e3-4a40-a1ff-66802515e914-kube-api-access-7xtk4\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.359558 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a38e49bf-c8bd-4581-81d8-04c735d9e281","Type":"ContainerDied","Data":"52c4399ccd9797a0497da8e4f70bc65e0d6d91e69ead1f774ca29d7de9756588"} Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.359664 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.359788 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83d2253f-62e3-4a40-a1ff-66802515e914" (UID: "83d2253f-62e3-4a40-a1ff-66802515e914"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.363045 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-config-data" (OuterVolumeSpecName: "config-data") pod "83d2253f-62e3-4a40-a1ff-66802515e914" (UID: "83d2253f-62e3-4a40-a1ff-66802515e914"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.364295 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.364970 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.367264 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.368747 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.369291 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.370044 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"70802626-e689-4f46-b3e4-5c2b74cec5bb","Type":"ContainerDied","Data":"e97bfa9565512888c35589a2484263e9c8adc7ad3ef33fe72c1d709e676171e8"} Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.376680 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.383540 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.389036 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.401951 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.415629 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4zt6\" (UniqueName: \"kubernetes.io/projected/12ce9e20-a637-474c-862b-a8c47381fda9-kube-api-access-m4zt6\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.417442 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: 
\"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.448502 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.454413 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.454455 4789 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d2253f-62e3-4a40-a1ff-66802515e914-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.462054 4789 scope.go:117] "RemoveContainer" containerID="4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.523524 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.533853 4789 scope.go:117] "RemoveContainer" containerID="1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277" Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.537222 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277\": container with ID starting with 1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277 not 
found: ID does not exist" containerID="1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.537304 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277"} err="failed to get container status \"1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277\": rpc error: code = NotFound desc = could not find container \"1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277\": container with ID starting with 1a44e22fbcd7b2e11b19292ac31f22854a21714c3ca737a55f58b86aafdf2277 not found: ID does not exist" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.537336 4789 scope.go:117] "RemoveContainer" containerID="4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c" Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.541124 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c\": container with ID starting with 4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c not found: ID does not exist" containerID="4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.541197 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c"} err="failed to get container status \"4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c\": rpc error: code = NotFound desc = could not find container \"4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c\": container with ID starting with 4ef7d7f76dd3639f52775f40b277bfe702d60c8abecc6af531643285ea315a2c not found: ID does not exist" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.541229 
4789 scope.go:117] "RemoveContainer" containerID="86c3b03baaccb18ae314a191d2fa45e7d1371b9bc7babc5165b6d3913214685c" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.554474 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.573768 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.600658 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.619225 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.629984 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.630797 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-log" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.630825 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-log" Dec 16 09:09:40 crc kubenswrapper[4789]: E1216 09:09:40.630834 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-metadata" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.630887 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-metadata" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.631133 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-log" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 
09:09:40.631157 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" containerName="nova-metadata-metadata" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.639280 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.645339 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.645445 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.661359 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d99xf\" (UniqueName: \"kubernetes.io/projected/a7cb033e-87a1-40d6-bf9f-0be6422d7493-kube-api-access-d99xf\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.661423 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cb033e-87a1-40d6-bf9f-0be6422d7493-config-data\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.661511 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cb033e-87a1-40d6-bf9f-0be6422d7493-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.661657 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a7cb033e-87a1-40d6-bf9f-0be6422d7493-logs\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.667000 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.676292 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.677737 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.687288 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.689993 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.705688 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.721848 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.723324 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.729353 4789 scope.go:117] "RemoveContainer" containerID="a4a1670d1a87a6d2ce3aaf61446e37348c93feb11559768fe701c88f1758b7f6" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.731469 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.737988 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.763752 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cb033e-87a1-40d6-bf9f-0be6422d7493-logs\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.764633 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d99xf\" (UniqueName: \"kubernetes.io/projected/a7cb033e-87a1-40d6-bf9f-0be6422d7493-kube-api-access-d99xf\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.764764 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cb033e-87a1-40d6-bf9f-0be6422d7493-config-data\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.764962 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftnb8\" (UniqueName: \"kubernetes.io/projected/e404e342-e901-43fc-9652-6c4c67a65469-kube-api-access-ftnb8\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " 
pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.765118 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cb033e-87a1-40d6-bf9f-0be6422d7493-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.765239 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e404e342-e901-43fc-9652-6c4c67a65469-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.765353 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e404e342-e901-43fc-9652-6c4c67a65469-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.764596 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cb033e-87a1-40d6-bf9f-0be6422d7493-logs\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.771589 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cb033e-87a1-40d6-bf9f-0be6422d7493-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.780353 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.787238 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cb033e-87a1-40d6-bf9f-0be6422d7493-config-data\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.806485 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d99xf\" (UniqueName: \"kubernetes.io/projected/a7cb033e-87a1-40d6-bf9f-0be6422d7493-kube-api-access-d99xf\") pod \"nova-api-0\" (UID: \"a7cb033e-87a1-40d6-bf9f-0be6422d7493\") " pod="openstack/nova-api-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.815858 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.815934 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.817668 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.831126 4789 scope.go:117] "RemoveContainer" containerID="2b85906693eee84dbe4f22bc3650497471e1fcfa4651ef649bf3a8b85b3831a4" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.831569 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.835012 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.871486 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38d7e41-cc58-418e-8988-969ed80309c0-config-data\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.871641 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftnb8\" (UniqueName: \"kubernetes.io/projected/e404e342-e901-43fc-9652-6c4c67a65469-kube-api-access-ftnb8\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.871714 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38d7e41-cc58-418e-8988-969ed80309c0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.871755 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e404e342-e901-43fc-9652-6c4c67a65469-combined-ca-bundle\") pod 
\"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.871782 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e404e342-e901-43fc-9652-6c4c67a65469-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.871823 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sd7z\" (UniqueName: \"kubernetes.io/projected/c38d7e41-cc58-418e-8988-969ed80309c0-kube-api-access-4sd7z\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.876623 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e404e342-e901-43fc-9652-6c4c67a65469-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.891074 4789 scope.go:117] "RemoveContainer" containerID="bee44f846086f1f8459648f0221132f8bd82d20a15e603a91dd34127e94c38f4" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.891393 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftnb8\" (UniqueName: \"kubernetes.io/projected/e404e342-e901-43fc-9652-6c4c67a65469-kube-api-access-ftnb8\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.891717 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e404e342-e901-43fc-9652-6c4c67a65469-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e404e342-e901-43fc-9652-6c4c67a65469\") " pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.973711 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38d7e41-cc58-418e-8988-969ed80309c0-config-data\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.973790 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-config-data\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.973826 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.973846 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-logs\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.973867 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqk6\" (UniqueName: \"kubernetes.io/projected/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-kube-api-access-sjqk6\") pod \"nova-metadata-0\" (UID: 
\"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.974246 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38d7e41-cc58-418e-8988-969ed80309c0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.974416 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sd7z\" (UniqueName: \"kubernetes.io/projected/c38d7e41-cc58-418e-8988-969ed80309c0-kube-api-access-4sd7z\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.978319 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c38d7e41-cc58-418e-8988-969ed80309c0-config-data\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.979564 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c38d7e41-cc58-418e-8988-969ed80309c0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:40 crc kubenswrapper[4789]: I1216 09:09:40.992314 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sd7z\" (UniqueName: \"kubernetes.io/projected/c38d7e41-cc58-418e-8988-969ed80309c0-kube-api-access-4sd7z\") pod \"nova-scheduler-0\" (UID: \"c38d7e41-cc58-418e-8988-969ed80309c0\") " pod="openstack/nova-scheduler-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.029815 4789 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.061078 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.083900 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-config-data\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.083989 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.084017 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-logs\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.084035 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqk6\" (UniqueName: \"kubernetes.io/projected/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-kube-api-access-sjqk6\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.086399 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-logs\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " 
pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.086657 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.088964 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.094349 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-config-data\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.112820 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqk6\" (UniqueName: \"kubernetes.io/projected/b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791-kube-api-access-sjqk6\") pod \"nova-metadata-0\" (UID: \"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791\") " pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.157555 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 09:09:41 crc kubenswrapper[4789]: W1216 09:09:41.391337 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12ce9e20_a637_474c_862b_a8c47381fda9.slice/crio-6e2929244e62d13ab487493f822ba94163f926b9d1da315bc5fa48896c340e42 WatchSource:0}: Error finding container 6e2929244e62d13ab487493f822ba94163f926b9d1da315bc5fa48896c340e42: Status 404 returned error can't find the container with id 6e2929244e62d13ab487493f822ba94163f926b9d1da315bc5fa48896c340e42 Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.393767 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w"] Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.395329 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.616550 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 09:09:41 crc kubenswrapper[4789]: W1216 09:09:41.617706 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7cb033e_87a1_40d6_bf9f_0be6422d7493.slice/crio-ecd75bec95042785c767a82f57ef90a93d5ccf110e557f60c53444b4120976d3 WatchSource:0}: Error finding container ecd75bec95042785c767a82f57ef90a93d5ccf110e557f60c53444b4120976d3: Status 404 returned error can't find the container with id ecd75bec95042785c767a82f57ef90a93d5ccf110e557f60c53444b4120976d3 Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.686727 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 09:09:41 crc kubenswrapper[4789]: W1216 09:09:41.777940 4789 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3e8043f_1e4d_41a1_92f6_9cfa3ad8b791.slice/crio-1b770632b0f09556d456723318021a2fd461c47411b407c3cb58c40a5f2e8a9b WatchSource:0}: Error finding container 1b770632b0f09556d456723318021a2fd461c47411b407c3cb58c40a5f2e8a9b: Status 404 returned error can't find the container with id 1b770632b0f09556d456723318021a2fd461c47411b407c3cb58c40a5f2e8a9b Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.784100 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 09:09:41 crc kubenswrapper[4789]: I1216 09:09:41.794248 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.125943 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70802626-e689-4f46-b3e4-5c2b74cec5bb" path="/var/lib/kubelet/pods/70802626-e689-4f46-b3e4-5c2b74cec5bb/volumes" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.127115 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d2253f-62e3-4a40-a1ff-66802515e914" path="/var/lib/kubelet/pods/83d2253f-62e3-4a40-a1ff-66802515e914/volumes" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.127811 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a38e49bf-c8bd-4581-81d8-04c735d9e281" path="/var/lib/kubelet/pods/a38e49bf-c8bd-4581-81d8-04c735d9e281/volumes" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.129201 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6465ba6-5f85-4b46-baea-61349bea2e86" path="/var/lib/kubelet/pods/c6465ba6-5f85-4b46-baea-61349bea2e86/volumes" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.420205 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"e404e342-e901-43fc-9652-6c4c67a65469","Type":"ContainerStarted","Data":"10e5420412119aff1bbbe700501c843cc32e07428d984d24b40f78f48d5968e0"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.420572 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e404e342-e901-43fc-9652-6c4c67a65469","Type":"ContainerStarted","Data":"9f8dbfb582c8847221d5bac7e9dd95318e6f380b47bccc54aa9cb6b24aa62db2"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.421544 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.434623 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7cb033e-87a1-40d6-bf9f-0be6422d7493","Type":"ContainerStarted","Data":"73bb58d12cd986ed0fabbf02565a4f16836375117c68c99841da26ea237bb7bb"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.434681 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7cb033e-87a1-40d6-bf9f-0be6422d7493","Type":"ContainerStarted","Data":"14c71907be859c271cc93c602c87579911b6e706782a8d7c890ba59760240903"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.434693 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7cb033e-87a1-40d6-bf9f-0be6422d7493","Type":"ContainerStarted","Data":"ecd75bec95042785c767a82f57ef90a93d5ccf110e557f60c53444b4120976d3"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.441264 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.441246152 podStartE2EDuration="2.441246152s" podCreationTimestamp="2025-12-16 09:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:09:42.440506074 +0000 UTC 
m=+8320.702393703" watchObservedRunningTime="2025-12-16 09:09:42.441246152 +0000 UTC m=+8320.703133791" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.452279 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c38d7e41-cc58-418e-8988-969ed80309c0","Type":"ContainerStarted","Data":"459def08f5fd83bdb318341dd93c6941d77565663388b8ffca3118b5c52d8dca"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.452320 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c38d7e41-cc58-418e-8988-969ed80309c0","Type":"ContainerStarted","Data":"42b02259346f7305e65701e9006fa0fad1d946d351af52d9fbe5f6bfdd19818f"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.458303 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" event={"ID":"12ce9e20-a637-474c-862b-a8c47381fda9","Type":"ContainerStarted","Data":"039eb5b157f17c194e2e1cc961160655fcdda0bb018511ee3dbeaf1f7563c8a0"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.458441 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" event={"ID":"12ce9e20-a637-474c-862b-a8c47381fda9","Type":"ContainerStarted","Data":"6e2929244e62d13ab487493f822ba94163f926b9d1da315bc5fa48896c340e42"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.462633 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791","Type":"ContainerStarted","Data":"0d9728e0906d5c95540970c6a9c9c56e0bd2678ab13a19701011d86827af895b"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.462671 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791","Type":"ContainerStarted","Data":"ef555255ac27ed5bd8240fb91ae89700b601e914ca0a0e5f86a5a2d4b0098612"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.462683 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791","Type":"ContainerStarted","Data":"1b770632b0f09556d456723318021a2fd461c47411b407c3cb58c40a5f2e8a9b"} Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.476667 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.476646416 podStartE2EDuration="2.476646416s" podCreationTimestamp="2025-12-16 09:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:09:42.470341562 +0000 UTC m=+8320.732229191" watchObservedRunningTime="2025-12-16 09:09:42.476646416 +0000 UTC m=+8320.738534045" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.525802 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5257847350000002 podStartE2EDuration="2.525784735s" podCreationTimestamp="2025-12-16 09:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 09:09:42.523232682 +0000 UTC m=+8320.785120311" watchObservedRunningTime="2025-12-16 09:09:42.525784735 +0000 UTC m=+8320.787672364" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.526239 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.526233795 podStartE2EDuration="2.526233795s" podCreationTimestamp="2025-12-16 09:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
09:09:42.503197554 +0000 UTC m=+8320.765085193" watchObservedRunningTime="2025-12-16 09:09:42.526233795 +0000 UTC m=+8320.788121414" Dec 16 09:09:42 crc kubenswrapper[4789]: I1216 09:09:42.547739 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" podStartSLOduration=1.978309278 podStartE2EDuration="2.547723889s" podCreationTimestamp="2025-12-16 09:09:40 +0000 UTC" firstStartedPulling="2025-12-16 09:09:41.395089361 +0000 UTC m=+8319.656976990" lastFinishedPulling="2025-12-16 09:09:41.964503972 +0000 UTC m=+8320.226391601" observedRunningTime="2025-12-16 09:09:42.544947651 +0000 UTC m=+8320.806835280" watchObservedRunningTime="2025-12-16 09:09:42.547723889 +0000 UTC m=+8320.809611518" Dec 16 09:09:46 crc kubenswrapper[4789]: I1216 09:09:46.062074 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 09:09:46 crc kubenswrapper[4789]: I1216 09:09:46.131294 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 16 09:09:46 crc kubenswrapper[4789]: I1216 09:09:46.163971 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 09:09:46 crc kubenswrapper[4789]: I1216 09:09:46.164429 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 09:09:47 crc kubenswrapper[4789]: I1216 09:09:47.105546 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:09:47 crc kubenswrapper[4789]: E1216 09:09:47.105832 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:09:47 crc kubenswrapper[4789]: I1216 09:09:47.661860 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 09:09:51 crc kubenswrapper[4789]: I1216 09:09:51.030286 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 09:09:51 crc kubenswrapper[4789]: I1216 09:09:51.031788 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 09:09:51 crc kubenswrapper[4789]: I1216 09:09:51.061676 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 09:09:51 crc kubenswrapper[4789]: I1216 09:09:51.089230 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 09:09:51 crc kubenswrapper[4789]: I1216 09:09:51.158398 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 09:09:51 crc kubenswrapper[4789]: I1216 09:09:51.158465 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 09:09:51 crc kubenswrapper[4789]: I1216 09:09:51.611230 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 09:09:52 crc kubenswrapper[4789]: I1216 09:09:52.113143 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7cb033e-87a1-40d6-bf9f-0be6422d7493" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.173:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:09:52 crc kubenswrapper[4789]: I1216 
09:09:52.113956 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7cb033e-87a1-40d6-bf9f-0be6422d7493" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.173:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:09:52 crc kubenswrapper[4789]: I1216 09:09:52.241099 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.176:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:09:52 crc kubenswrapper[4789]: I1216 09:09:52.241371 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.176:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.034320 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.035335 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.037262 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.037840 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.104400 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:10:01 crc kubenswrapper[4789]: E1216 09:10:01.104674 4789 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.160068 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.160261 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.163608 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.164148 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.685537 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 09:10:01 crc kubenswrapper[4789]: I1216 09:10:01.688992 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 09:10:15 crc kubenswrapper[4789]: I1216 09:10:15.106043 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:10:15 crc kubenswrapper[4789]: E1216 09:10:15.107326 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:10:29 crc kubenswrapper[4789]: I1216 09:10:29.105763 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:10:29 crc kubenswrapper[4789]: E1216 09:10:29.106601 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:10:43 crc kubenswrapper[4789]: I1216 09:10:43.105529 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:10:43 crc kubenswrapper[4789]: E1216 09:10:43.106308 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:10:55 crc kubenswrapper[4789]: I1216 09:10:55.105532 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:10:55 crc kubenswrapper[4789]: E1216 09:10:55.106314 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:11:09 crc kubenswrapper[4789]: I1216 09:11:09.104651 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:11:09 crc kubenswrapper[4789]: E1216 09:11:09.105475 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:11:24 crc kubenswrapper[4789]: I1216 09:11:24.105658 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:11:24 crc kubenswrapper[4789]: E1216 09:11:24.106727 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:11:37 crc kubenswrapper[4789]: I1216 09:11:37.104699 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:11:37 crc kubenswrapper[4789]: E1216 09:11:37.105620 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.578063 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s52t7"] Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.580964 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.588361 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s52t7"] Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.643838 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-catalog-content\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.644236 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-utilities\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.644415 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5wxw\" (UniqueName: \"kubernetes.io/projected/627329dc-9883-4caa-99b6-31cf32666e8c-kube-api-access-s5wxw\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " 
pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.746721 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-utilities\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.746796 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5wxw\" (UniqueName: \"kubernetes.io/projected/627329dc-9883-4caa-99b6-31cf32666e8c-kube-api-access-s5wxw\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.746911 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-catalog-content\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.747493 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-catalog-content\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.747901 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-utilities\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc 
kubenswrapper[4789]: I1216 09:11:38.768887 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5wxw\" (UniqueName: \"kubernetes.io/projected/627329dc-9883-4caa-99b6-31cf32666e8c-kube-api-access-s5wxw\") pod \"redhat-operators-s52t7\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:38 crc kubenswrapper[4789]: I1216 09:11:38.911512 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:39 crc kubenswrapper[4789]: I1216 09:11:39.445579 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s52t7"] Dec 16 09:11:39 crc kubenswrapper[4789]: I1216 09:11:39.605008 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s52t7" event={"ID":"627329dc-9883-4caa-99b6-31cf32666e8c","Type":"ContainerStarted","Data":"25f8554ad8c633c542d7add046cb18067e1997fdcbe613cc1f212675b4b6eec0"} Dec 16 09:11:40 crc kubenswrapper[4789]: I1216 09:11:40.619582 4789 generic.go:334] "Generic (PLEG): container finished" podID="627329dc-9883-4caa-99b6-31cf32666e8c" containerID="976b50e69e7716e5115061095d59a521358d3b130fe3781f8dedd40ccbce2740" exitCode=0 Dec 16 09:11:40 crc kubenswrapper[4789]: I1216 09:11:40.619662 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s52t7" event={"ID":"627329dc-9883-4caa-99b6-31cf32666e8c","Type":"ContainerDied","Data":"976b50e69e7716e5115061095d59a521358d3b130fe3781f8dedd40ccbce2740"} Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.576710 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g4z5k"] Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.580178 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.588078 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g4z5k"] Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.615617 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-catalog-content\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.616056 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js7q9\" (UniqueName: \"kubernetes.io/projected/2be95354-5dbf-401d-9ff0-587d67d57030-kube-api-access-js7q9\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.616154 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-utilities\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.717416 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js7q9\" (UniqueName: \"kubernetes.io/projected/2be95354-5dbf-401d-9ff0-587d67d57030-kube-api-access-js7q9\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.717453 4789 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-utilities\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.717572 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-catalog-content\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.718027 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-utilities\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.718082 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-catalog-content\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.737081 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js7q9\" (UniqueName: \"kubernetes.io/projected/2be95354-5dbf-401d-9ff0-587d67d57030-kube-api-access-js7q9\") pod \"certified-operators-g4z5k\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:41 crc kubenswrapper[4789]: I1216 09:11:41.906314 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:42 crc kubenswrapper[4789]: I1216 09:11:42.569829 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g4z5k"] Dec 16 09:11:42 crc kubenswrapper[4789]: W1216 09:11:42.573603 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be95354_5dbf_401d_9ff0_587d67d57030.slice/crio-eb4003b9a0f123ef6cba32b0904654312f67c152a55d06bd4c6bc46d28e37012 WatchSource:0}: Error finding container eb4003b9a0f123ef6cba32b0904654312f67c152a55d06bd4c6bc46d28e37012: Status 404 returned error can't find the container with id eb4003b9a0f123ef6cba32b0904654312f67c152a55d06bd4c6bc46d28e37012 Dec 16 09:11:42 crc kubenswrapper[4789]: I1216 09:11:42.642323 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s52t7" event={"ID":"627329dc-9883-4caa-99b6-31cf32666e8c","Type":"ContainerStarted","Data":"8504a1ecb2cf2d1b6171fc25e3f36998c86f7273d73f6ac2c481f13f85595388"} Dec 16 09:11:42 crc kubenswrapper[4789]: I1216 09:11:42.645274 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4z5k" event={"ID":"2be95354-5dbf-401d-9ff0-587d67d57030","Type":"ContainerStarted","Data":"eb4003b9a0f123ef6cba32b0904654312f67c152a55d06bd4c6bc46d28e37012"} Dec 16 09:11:44 crc kubenswrapper[4789]: I1216 09:11:44.665477 4789 generic.go:334] "Generic (PLEG): container finished" podID="2be95354-5dbf-401d-9ff0-587d67d57030" containerID="5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa" exitCode=0 Dec 16 09:11:44 crc kubenswrapper[4789]: I1216 09:11:44.665604 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4z5k" 
event={"ID":"2be95354-5dbf-401d-9ff0-587d67d57030","Type":"ContainerDied","Data":"5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa"} Dec 16 09:11:46 crc kubenswrapper[4789]: I1216 09:11:46.686462 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4z5k" event={"ID":"2be95354-5dbf-401d-9ff0-587d67d57030","Type":"ContainerStarted","Data":"98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606"} Dec 16 09:11:47 crc kubenswrapper[4789]: I1216 09:11:47.696926 4789 generic.go:334] "Generic (PLEG): container finished" podID="627329dc-9883-4caa-99b6-31cf32666e8c" containerID="8504a1ecb2cf2d1b6171fc25e3f36998c86f7273d73f6ac2c481f13f85595388" exitCode=0 Dec 16 09:11:47 crc kubenswrapper[4789]: I1216 09:11:47.696977 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s52t7" event={"ID":"627329dc-9883-4caa-99b6-31cf32666e8c","Type":"ContainerDied","Data":"8504a1ecb2cf2d1b6171fc25e3f36998c86f7273d73f6ac2c481f13f85595388"} Dec 16 09:11:49 crc kubenswrapper[4789]: I1216 09:11:49.105350 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:11:49 crc kubenswrapper[4789]: E1216 09:11:49.106112 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:11:49 crc kubenswrapper[4789]: I1216 09:11:49.715693 4789 generic.go:334] "Generic (PLEG): container finished" podID="2be95354-5dbf-401d-9ff0-587d67d57030" containerID="98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606" exitCode=0 Dec 16 09:11:49 crc 
kubenswrapper[4789]: I1216 09:11:49.715772 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4z5k" event={"ID":"2be95354-5dbf-401d-9ff0-587d67d57030","Type":"ContainerDied","Data":"98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606"} Dec 16 09:11:49 crc kubenswrapper[4789]: I1216 09:11:49.718668 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s52t7" event={"ID":"627329dc-9883-4caa-99b6-31cf32666e8c","Type":"ContainerStarted","Data":"b516b66fdd451e94db5f14486659d3aac94ac05d5190773b8efdabb686a197dd"} Dec 16 09:11:49 crc kubenswrapper[4789]: I1216 09:11:49.774583 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s52t7" podStartSLOduration=3.585041979 podStartE2EDuration="11.774555827s" podCreationTimestamp="2025-12-16 09:11:38 +0000 UTC" firstStartedPulling="2025-12-16 09:11:40.622709042 +0000 UTC m=+8438.884596671" lastFinishedPulling="2025-12-16 09:11:48.81222289 +0000 UTC m=+8447.074110519" observedRunningTime="2025-12-16 09:11:49.764295456 +0000 UTC m=+8448.026183095" watchObservedRunningTime="2025-12-16 09:11:49.774555827 +0000 UTC m=+8448.036443456" Dec 16 09:11:50 crc kubenswrapper[4789]: I1216 09:11:50.730393 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4z5k" event={"ID":"2be95354-5dbf-401d-9ff0-587d67d57030","Type":"ContainerStarted","Data":"0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1"} Dec 16 09:11:50 crc kubenswrapper[4789]: I1216 09:11:50.749550 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g4z5k" podStartSLOduration=4.234987586 podStartE2EDuration="9.74953366s" podCreationTimestamp="2025-12-16 09:11:41 +0000 UTC" firstStartedPulling="2025-12-16 09:11:44.667252236 +0000 UTC m=+8442.929139865" lastFinishedPulling="2025-12-16 
09:11:50.18179831 +0000 UTC m=+8448.443685939" observedRunningTime="2025-12-16 09:11:50.747658475 +0000 UTC m=+8449.009546104" watchObservedRunningTime="2025-12-16 09:11:50.74953366 +0000 UTC m=+8449.011421289" Dec 16 09:11:51 crc kubenswrapper[4789]: I1216 09:11:51.907903 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:51 crc kubenswrapper[4789]: I1216 09:11:51.908256 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:11:52 crc kubenswrapper[4789]: I1216 09:11:52.959125 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-g4z5k" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="registry-server" probeResult="failure" output=< Dec 16 09:11:52 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 09:11:52 crc kubenswrapper[4789]: > Dec 16 09:11:58 crc kubenswrapper[4789]: I1216 09:11:58.913388 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:58 crc kubenswrapper[4789]: I1216 09:11:58.914242 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:59 crc kubenswrapper[4789]: I1216 09:11:59.209642 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:59 crc kubenswrapper[4789]: I1216 09:11:59.854107 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:11:59 crc kubenswrapper[4789]: I1216 09:11:59.916346 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s52t7"] Dec 16 09:12:01 crc kubenswrapper[4789]: I1216 09:12:01.825047 4789 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s52t7" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="registry-server" containerID="cri-o://b516b66fdd451e94db5f14486659d3aac94ac05d5190773b8efdabb686a197dd" gracePeriod=2 Dec 16 09:12:01 crc kubenswrapper[4789]: I1216 09:12:01.960526 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:12:02 crc kubenswrapper[4789]: I1216 09:12:02.025113 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:12:02 crc kubenswrapper[4789]: I1216 09:12:02.112615 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:12:02 crc kubenswrapper[4789]: E1216 09:12:02.113171 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:12:02 crc kubenswrapper[4789]: I1216 09:12:02.837817 4789 generic.go:334] "Generic (PLEG): container finished" podID="627329dc-9883-4caa-99b6-31cf32666e8c" containerID="b516b66fdd451e94db5f14486659d3aac94ac05d5190773b8efdabb686a197dd" exitCode=0 Dec 16 09:12:02 crc kubenswrapper[4789]: I1216 09:12:02.837870 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s52t7" event={"ID":"627329dc-9883-4caa-99b6-31cf32666e8c","Type":"ContainerDied","Data":"b516b66fdd451e94db5f14486659d3aac94ac05d5190773b8efdabb686a197dd"} Dec 16 09:12:02 crc kubenswrapper[4789]: I1216 09:12:02.845734 4789 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g4z5k"] Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.454174 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.586646 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5wxw\" (UniqueName: \"kubernetes.io/projected/627329dc-9883-4caa-99b6-31cf32666e8c-kube-api-access-s5wxw\") pod \"627329dc-9883-4caa-99b6-31cf32666e8c\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.586795 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-utilities\") pod \"627329dc-9883-4caa-99b6-31cf32666e8c\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.586838 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-catalog-content\") pod \"627329dc-9883-4caa-99b6-31cf32666e8c\" (UID: \"627329dc-9883-4caa-99b6-31cf32666e8c\") " Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.587555 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-utilities" (OuterVolumeSpecName: "utilities") pod "627329dc-9883-4caa-99b6-31cf32666e8c" (UID: "627329dc-9883-4caa-99b6-31cf32666e8c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.592452 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627329dc-9883-4caa-99b6-31cf32666e8c-kube-api-access-s5wxw" (OuterVolumeSpecName: "kube-api-access-s5wxw") pod "627329dc-9883-4caa-99b6-31cf32666e8c" (UID: "627329dc-9883-4caa-99b6-31cf32666e8c"). InnerVolumeSpecName "kube-api-access-s5wxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.689757 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5wxw\" (UniqueName: \"kubernetes.io/projected/627329dc-9883-4caa-99b6-31cf32666e8c-kube-api-access-s5wxw\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.689806 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.707079 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "627329dc-9883-4caa-99b6-31cf32666e8c" (UID: "627329dc-9883-4caa-99b6-31cf32666e8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.791870 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/627329dc-9883-4caa-99b6-31cf32666e8c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.850167 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s52t7" event={"ID":"627329dc-9883-4caa-99b6-31cf32666e8c","Type":"ContainerDied","Data":"25f8554ad8c633c542d7add046cb18067e1997fdcbe613cc1f212675b4b6eec0"} Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.850238 4789 scope.go:117] "RemoveContainer" containerID="b516b66fdd451e94db5f14486659d3aac94ac05d5190773b8efdabb686a197dd" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.850183 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s52t7" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.850680 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g4z5k" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="registry-server" containerID="cri-o://0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1" gracePeriod=2 Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.889223 4789 scope.go:117] "RemoveContainer" containerID="8504a1ecb2cf2d1b6171fc25e3f36998c86f7273d73f6ac2c481f13f85595388" Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.889470 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s52t7"] Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.900757 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s52t7"] Dec 16 09:12:03 crc kubenswrapper[4789]: I1216 09:12:03.917043 4789 scope.go:117] 
"RemoveContainer" containerID="976b50e69e7716e5115061095d59a521358d3b130fe3781f8dedd40ccbce2740" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.118441 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" path="/var/lib/kubelet/pods/627329dc-9883-4caa-99b6-31cf32666e8c/volumes" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.284306 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.404211 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js7q9\" (UniqueName: \"kubernetes.io/projected/2be95354-5dbf-401d-9ff0-587d67d57030-kube-api-access-js7q9\") pod \"2be95354-5dbf-401d-9ff0-587d67d57030\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.404258 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-catalog-content\") pod \"2be95354-5dbf-401d-9ff0-587d67d57030\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.404509 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-utilities\") pod \"2be95354-5dbf-401d-9ff0-587d67d57030\" (UID: \"2be95354-5dbf-401d-9ff0-587d67d57030\") " Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.406404 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-utilities" (OuterVolumeSpecName: "utilities") pod "2be95354-5dbf-401d-9ff0-587d67d57030" (UID: "2be95354-5dbf-401d-9ff0-587d67d57030"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.411080 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be95354-5dbf-401d-9ff0-587d67d57030-kube-api-access-js7q9" (OuterVolumeSpecName: "kube-api-access-js7q9") pod "2be95354-5dbf-401d-9ff0-587d67d57030" (UID: "2be95354-5dbf-401d-9ff0-587d67d57030"). InnerVolumeSpecName "kube-api-access-js7q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.461389 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2be95354-5dbf-401d-9ff0-587d67d57030" (UID: "2be95354-5dbf-401d-9ff0-587d67d57030"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.507320 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js7q9\" (UniqueName: \"kubernetes.io/projected/2be95354-5dbf-401d-9ff0-587d67d57030-kube-api-access-js7q9\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.507355 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.507364 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2be95354-5dbf-401d-9ff0-587d67d57030-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.866782 4789 generic.go:334] "Generic (PLEG): container finished" podID="2be95354-5dbf-401d-9ff0-587d67d57030" 
containerID="0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1" exitCode=0 Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.866847 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g4z5k" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.866902 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4z5k" event={"ID":"2be95354-5dbf-401d-9ff0-587d67d57030","Type":"ContainerDied","Data":"0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1"} Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.866986 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g4z5k" event={"ID":"2be95354-5dbf-401d-9ff0-587d67d57030","Type":"ContainerDied","Data":"eb4003b9a0f123ef6cba32b0904654312f67c152a55d06bd4c6bc46d28e37012"} Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.867010 4789 scope.go:117] "RemoveContainer" containerID="0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.909660 4789 scope.go:117] "RemoveContainer" containerID="98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.921139 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g4z5k"] Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.935831 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g4z5k"] Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.941486 4789 scope.go:117] "RemoveContainer" containerID="5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.982108 4789 scope.go:117] "RemoveContainer" containerID="0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1" Dec 16 
09:12:04 crc kubenswrapper[4789]: E1216 09:12:04.982635 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1\": container with ID starting with 0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1 not found: ID does not exist" containerID="0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.982696 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1"} err="failed to get container status \"0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1\": rpc error: code = NotFound desc = could not find container \"0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1\": container with ID starting with 0e31386860d508a6f36aa03671fb7b649f3e0a71da91de330fecb2ec52ba9cf1 not found: ID does not exist" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.982751 4789 scope.go:117] "RemoveContainer" containerID="98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606" Dec 16 09:12:04 crc kubenswrapper[4789]: E1216 09:12:04.983804 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606\": container with ID starting with 98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606 not found: ID does not exist" containerID="98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.983892 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606"} err="failed to get container status 
\"98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606\": rpc error: code = NotFound desc = could not find container \"98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606\": container with ID starting with 98a35a8342c35ae3c60ec0c409bfaf65945c4abd70a0d10f48be9a880bc5f606 not found: ID does not exist" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.983987 4789 scope.go:117] "RemoveContainer" containerID="5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa" Dec 16 09:12:04 crc kubenswrapper[4789]: E1216 09:12:04.984467 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa\": container with ID starting with 5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa not found: ID does not exist" containerID="5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa" Dec 16 09:12:04 crc kubenswrapper[4789]: I1216 09:12:04.984505 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa"} err="failed to get container status \"5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa\": rpc error: code = NotFound desc = could not find container \"5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa\": container with ID starting with 5430cd3c8a4f03242be599f9639ac646d1ada79a2ae36e860bb6c91f9edfbafa not found: ID does not exist" Dec 16 09:12:06 crc kubenswrapper[4789]: I1216 09:12:06.117448 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" path="/var/lib/kubelet/pods/2be95354-5dbf-401d-9ff0-587d67d57030/volumes" Dec 16 09:12:17 crc kubenswrapper[4789]: I1216 09:12:17.104870 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 
09:12:17 crc kubenswrapper[4789]: E1216 09:12:17.105691 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:12:30 crc kubenswrapper[4789]: I1216 09:12:30.105527 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:12:31 crc kubenswrapper[4789]: I1216 09:12:31.110126 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"849ec36ca6ea5e90f98f29a11e0d3e94cc588787a9f4c7ff35d827cbbcbb4e10"} Dec 16 09:13:34 crc kubenswrapper[4789]: I1216 09:13:34.704406 4789 generic.go:334] "Generic (PLEG): container finished" podID="12ce9e20-a637-474c-862b-a8c47381fda9" containerID="039eb5b157f17c194e2e1cc961160655fcdda0bb018511ee3dbeaf1f7563c8a0" exitCode=0 Dec 16 09:13:34 crc kubenswrapper[4789]: I1216 09:13:34.704500 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" event={"ID":"12ce9e20-a637-474c-862b-a8c47381fda9","Type":"ContainerDied","Data":"039eb5b157f17c194e2e1cc961160655fcdda0bb018511ee3dbeaf1f7563c8a0"} Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.211062 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361496 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-0\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361560 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-1\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361613 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4zt6\" (UniqueName: \"kubernetes.io/projected/12ce9e20-a637-474c-862b-a8c47381fda9-kube-api-access-m4zt6\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361703 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ceph\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361743 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-inventory\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361783 4789 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-combined-ca-bundle\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361822 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ssh-key\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361899 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-1\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.361983 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-1\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.362032 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-0\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.362194 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-0\") pod \"12ce9e20-a637-474c-862b-a8c47381fda9\" (UID: \"12ce9e20-a637-474c-862b-a8c47381fda9\") " Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.368049 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.368408 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ceph" (OuterVolumeSpecName: "ceph") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.369210 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ce9e20-a637-474c-862b-a8c47381fda9-kube-api-access-m4zt6" (OuterVolumeSpecName: "kube-api-access-m4zt6") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "kube-api-access-m4zt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.391367 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.392671 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.396064 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.396243 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.396627 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.398687 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.400736 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.410155 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-inventory" (OuterVolumeSpecName: "inventory") pod "12ce9e20-a637-474c-862b-a8c47381fda9" (UID: "12ce9e20-a637-474c-862b-a8c47381fda9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465044 4789 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465088 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465100 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465108 4789 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465118 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465127 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4zt6\" (UniqueName: \"kubernetes.io/projected/12ce9e20-a637-474c-862b-a8c47381fda9-kube-api-access-m4zt6\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465136 4789 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ceph\") on node 
\"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465144 4789 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465153 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465166 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.465177 4789 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/12ce9e20-a637-474c-862b-a8c47381fda9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.727272 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" event={"ID":"12ce9e20-a637-474c-862b-a8c47381fda9","Type":"ContainerDied","Data":"6e2929244e62d13ab487493f822ba94163f926b9d1da315bc5fa48896c340e42"} Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.727593 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e2929244e62d13ab487493f822ba94163f926b9d1da315bc5fa48896c340e42" Dec 16 09:13:36 crc kubenswrapper[4789]: I1216 09:13:36.727337 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.070566 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8nkks"] Dec 16 09:14:20 crc kubenswrapper[4789]: E1216 09:14:20.071623 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="extract-utilities" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071636 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="extract-utilities" Dec 16 09:14:20 crc kubenswrapper[4789]: E1216 09:14:20.071645 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="registry-server" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071654 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="registry-server" Dec 16 09:14:20 crc kubenswrapper[4789]: E1216 09:14:20.071670 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="extract-content" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071676 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="extract-content" Dec 16 09:14:20 crc kubenswrapper[4789]: E1216 09:14:20.071692 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="registry-server" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071698 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="registry-server" Dec 16 09:14:20 crc kubenswrapper[4789]: E1216 09:14:20.071709 4789 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="extract-utilities" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071715 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="extract-utilities" Dec 16 09:14:20 crc kubenswrapper[4789]: E1216 09:14:20.071732 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ce9e20-a637-474c-862b-a8c47381fda9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071738 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ce9e20-a637-474c-862b-a8c47381fda9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 16 09:14:20 crc kubenswrapper[4789]: E1216 09:14:20.071752 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="extract-content" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071759 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="extract-content" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071935 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be95354-5dbf-401d-9ff0-587d67d57030" containerName="registry-server" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071961 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ce9e20-a637-474c-862b-a8c47381fda9" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.071976 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="627329dc-9883-4caa-99b6-31cf32666e8c" containerName="registry-server" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.073529 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.086421 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nkks"] Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.213334 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-utilities\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.213470 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-catalog-content\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.213535 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xp68\" (UniqueName: \"kubernetes.io/projected/d82d17f5-52d7-4ffe-bdff-76303ad86665-kube-api-access-5xp68\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.315317 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-catalog-content\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.315417 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5xp68\" (UniqueName: \"kubernetes.io/projected/d82d17f5-52d7-4ffe-bdff-76303ad86665-kube-api-access-5xp68\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.315504 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-utilities\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.315894 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-catalog-content\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.315964 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-utilities\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.346782 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xp68\" (UniqueName: \"kubernetes.io/projected/d82d17f5-52d7-4ffe-bdff-76303ad86665-kube-api-access-5xp68\") pod \"community-operators-8nkks\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.399824 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:20 crc kubenswrapper[4789]: I1216 09:14:20.954366 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8nkks"] Dec 16 09:14:21 crc kubenswrapper[4789]: I1216 09:14:21.176326 4789 generic.go:334] "Generic (PLEG): container finished" podID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerID="3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9" exitCode=0 Dec 16 09:14:21 crc kubenswrapper[4789]: I1216 09:14:21.176570 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nkks" event={"ID":"d82d17f5-52d7-4ffe-bdff-76303ad86665","Type":"ContainerDied","Data":"3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9"} Dec 16 09:14:21 crc kubenswrapper[4789]: I1216 09:14:21.176594 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nkks" event={"ID":"d82d17f5-52d7-4ffe-bdff-76303ad86665","Type":"ContainerStarted","Data":"57bf51e03ca2a8af21b85d22b92b078de0744f97b1e1aa41177032d649f0c5a7"} Dec 16 09:14:22 crc kubenswrapper[4789]: I1216 09:14:22.192495 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nkks" event={"ID":"d82d17f5-52d7-4ffe-bdff-76303ad86665","Type":"ContainerStarted","Data":"b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b"} Dec 16 09:14:23 crc kubenswrapper[4789]: I1216 09:14:23.203294 4789 generic.go:334] "Generic (PLEG): container finished" podID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerID="b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b" exitCode=0 Dec 16 09:14:23 crc kubenswrapper[4789]: I1216 09:14:23.203362 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nkks" 
event={"ID":"d82d17f5-52d7-4ffe-bdff-76303ad86665","Type":"ContainerDied","Data":"b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b"} Dec 16 09:14:24 crc kubenswrapper[4789]: I1216 09:14:24.215780 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nkks" event={"ID":"d82d17f5-52d7-4ffe-bdff-76303ad86665","Type":"ContainerStarted","Data":"e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9"} Dec 16 09:14:24 crc kubenswrapper[4789]: I1216 09:14:24.238319 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8nkks" podStartSLOduration=1.718148005 podStartE2EDuration="4.238298913s" podCreationTimestamp="2025-12-16 09:14:20 +0000 UTC" firstStartedPulling="2025-12-16 09:14:21.178026808 +0000 UTC m=+8599.439914437" lastFinishedPulling="2025-12-16 09:14:23.698177716 +0000 UTC m=+8601.960065345" observedRunningTime="2025-12-16 09:14:24.231407835 +0000 UTC m=+8602.493295464" watchObservedRunningTime="2025-12-16 09:14:24.238298913 +0000 UTC m=+8602.500186542" Dec 16 09:14:30 crc kubenswrapper[4789]: I1216 09:14:30.400850 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:30 crc kubenswrapper[4789]: I1216 09:14:30.401514 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:30 crc kubenswrapper[4789]: I1216 09:14:30.447164 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:31 crc kubenswrapper[4789]: I1216 09:14:31.368877 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:31 crc kubenswrapper[4789]: I1216 09:14:31.433281 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8nkks"] Dec 16 09:14:33 crc kubenswrapper[4789]: I1216 09:14:33.336131 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8nkks" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="registry-server" containerID="cri-o://e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9" gracePeriod=2 Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.332654 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.353847 4789 generic.go:334] "Generic (PLEG): container finished" podID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerID="e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9" exitCode=0 Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.353905 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8nkks" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.353981 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nkks" event={"ID":"d82d17f5-52d7-4ffe-bdff-76303ad86665","Type":"ContainerDied","Data":"e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9"} Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.354735 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8nkks" event={"ID":"d82d17f5-52d7-4ffe-bdff-76303ad86665","Type":"ContainerDied","Data":"57bf51e03ca2a8af21b85d22b92b078de0744f97b1e1aa41177032d649f0c5a7"} Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.354762 4789 scope.go:117] "RemoveContainer" containerID="e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.398690 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-catalog-content\") pod \"d82d17f5-52d7-4ffe-bdff-76303ad86665\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.398748 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-utilities\") pod \"d82d17f5-52d7-4ffe-bdff-76303ad86665\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.398815 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xp68\" (UniqueName: \"kubernetes.io/projected/d82d17f5-52d7-4ffe-bdff-76303ad86665-kube-api-access-5xp68\") pod \"d82d17f5-52d7-4ffe-bdff-76303ad86665\" (UID: \"d82d17f5-52d7-4ffe-bdff-76303ad86665\") " Dec 16 09:14:34 crc 
kubenswrapper[4789]: I1216 09:14:34.400060 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-utilities" (OuterVolumeSpecName: "utilities") pod "d82d17f5-52d7-4ffe-bdff-76303ad86665" (UID: "d82d17f5-52d7-4ffe-bdff-76303ad86665"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.404718 4789 scope.go:117] "RemoveContainer" containerID="b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.404865 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82d17f5-52d7-4ffe-bdff-76303ad86665-kube-api-access-5xp68" (OuterVolumeSpecName: "kube-api-access-5xp68") pod "d82d17f5-52d7-4ffe-bdff-76303ad86665" (UID: "d82d17f5-52d7-4ffe-bdff-76303ad86665"). InnerVolumeSpecName "kube-api-access-5xp68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.471465 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d82d17f5-52d7-4ffe-bdff-76303ad86665" (UID: "d82d17f5-52d7-4ffe-bdff-76303ad86665"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.473479 4789 scope.go:117] "RemoveContainer" containerID="3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.501239 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.501277 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82d17f5-52d7-4ffe-bdff-76303ad86665-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.501292 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xp68\" (UniqueName: \"kubernetes.io/projected/d82d17f5-52d7-4ffe-bdff-76303ad86665-kube-api-access-5xp68\") on node \"crc\" DevicePath \"\"" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.515910 4789 scope.go:117] "RemoveContainer" containerID="e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9" Dec 16 09:14:34 crc kubenswrapper[4789]: E1216 09:14:34.516459 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9\": container with ID starting with e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9 not found: ID does not exist" containerID="e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.516496 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9"} err="failed to get container status 
\"e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9\": rpc error: code = NotFound desc = could not find container \"e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9\": container with ID starting with e94be8357ba25e3429e7f25588dd5391a914328d75bae2f0a2bef211bdc9bed9 not found: ID does not exist" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.516520 4789 scope.go:117] "RemoveContainer" containerID="b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b" Dec 16 09:14:34 crc kubenswrapper[4789]: E1216 09:14:34.517191 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b\": container with ID starting with b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b not found: ID does not exist" containerID="b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.517220 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b"} err="failed to get container status \"b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b\": rpc error: code = NotFound desc = could not find container \"b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b\": container with ID starting with b51c378342ff473ac6aa069f23e0079a67da0fe8110ab331f3ef27396ba1880b not found: ID does not exist" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.517239 4789 scope.go:117] "RemoveContainer" containerID="3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9" Dec 16 09:14:34 crc kubenswrapper[4789]: E1216 09:14:34.517480 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9\": container with ID starting with 3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9 not found: ID does not exist" containerID="3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.517512 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9"} err="failed to get container status \"3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9\": rpc error: code = NotFound desc = could not find container \"3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9\": container with ID starting with 3d2f7919334c1084220e2759a01c84b5045849940b893216fab06cb34b7836b9 not found: ID does not exist" Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.719398 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8nkks"] Dec 16 09:14:34 crc kubenswrapper[4789]: I1216 09:14:34.731399 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8nkks"] Dec 16 09:14:36 crc kubenswrapper[4789]: I1216 09:14:36.116187 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" path="/var/lib/kubelet/pods/d82d17f5-52d7-4ffe-bdff-76303ad86665/volumes" Dec 16 09:14:51 crc kubenswrapper[4789]: I1216 09:14:51.928369 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:14:51 crc kubenswrapper[4789]: I1216 09:14:51.928900 4789 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.154754 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc"] Dec 16 09:15:00 crc kubenswrapper[4789]: E1216 09:15:00.155792 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="extract-content" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.155806 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="extract-content" Dec 16 09:15:00 crc kubenswrapper[4789]: E1216 09:15:00.155841 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="registry-server" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.155851 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="registry-server" Dec 16 09:15:00 crc kubenswrapper[4789]: E1216 09:15:00.155868 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="extract-utilities" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.155874 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="extract-utilities" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.156107 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82d17f5-52d7-4ffe-bdff-76303ad86665" containerName="registry-server" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.156998 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.159496 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.163290 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc"] Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.165665 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.191202 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqk7\" (UniqueName: \"kubernetes.io/projected/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-kube-api-access-8wqk7\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.191354 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-secret-volume\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.191424 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-config-volume\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.294638 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-secret-volume\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.294761 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-config-volume\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.295868 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-config-volume\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.295967 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqk7\" (UniqueName: \"kubernetes.io/projected/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-kube-api-access-8wqk7\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.301140 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-secret-volume\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.314892 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqk7\" (UniqueName: \"kubernetes.io/projected/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-kube-api-access-8wqk7\") pod \"collect-profiles-29431275-6j8lc\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.502755 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:00 crc kubenswrapper[4789]: I1216 09:15:00.976802 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc"] Dec 16 09:15:01 crc kubenswrapper[4789]: I1216 09:15:01.633960 4789 generic.go:334] "Generic (PLEG): container finished" podID="05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4" containerID="87d1be9588870b221cfca0a8a171d855f916166a9524121e86758be25b0e859c" exitCode=0 Dec 16 09:15:01 crc kubenswrapper[4789]: I1216 09:15:01.634224 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" event={"ID":"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4","Type":"ContainerDied","Data":"87d1be9588870b221cfca0a8a171d855f916166a9524121e86758be25b0e859c"} Dec 16 09:15:01 crc kubenswrapper[4789]: I1216 09:15:01.634286 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" 
event={"ID":"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4","Type":"ContainerStarted","Data":"e934c71008b419d4df5d2666b0b18a959959172a6710b7ec46f5869578fdc24e"} Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.026612 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.063900 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-config-volume\") pod \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.064042 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqk7\" (UniqueName: \"kubernetes.io/projected/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-kube-api-access-8wqk7\") pod \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.064418 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-config-volume" (OuterVolumeSpecName: "config-volume") pod "05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4" (UID: "05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.065003 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-secret-volume\") pod \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\" (UID: \"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4\") " Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.066422 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.070372 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-kube-api-access-8wqk7" (OuterVolumeSpecName: "kube-api-access-8wqk7") pod "05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4" (UID: "05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4"). InnerVolumeSpecName "kube-api-access-8wqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.076675 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4" (UID: "05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.168604 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wqk7\" (UniqueName: \"kubernetes.io/projected/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-kube-api-access-8wqk7\") on node \"crc\" DevicePath \"\"" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.168640 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.656924 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" event={"ID":"05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4","Type":"ContainerDied","Data":"e934c71008b419d4df5d2666b0b18a959959172a6710b7ec46f5869578fdc24e"} Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.657128 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e934c71008b419d4df5d2666b0b18a959959172a6710b7ec46f5869578fdc24e" Dec 16 09:15:03 crc kubenswrapper[4789]: I1216 09:15:03.656985 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431275-6j8lc" Dec 16 09:15:04 crc kubenswrapper[4789]: I1216 09:15:04.095461 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx"] Dec 16 09:15:04 crc kubenswrapper[4789]: I1216 09:15:04.103746 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431230-2kfnx"] Dec 16 09:15:04 crc kubenswrapper[4789]: I1216 09:15:04.116736 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5ed2f6-2519-4162-b31d-16fb006bc53d" path="/var/lib/kubelet/pods/fa5ed2f6-2519-4162-b31d-16fb006bc53d/volumes" Dec 16 09:15:21 crc kubenswrapper[4789]: I1216 09:15:21.928228 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:15:21 crc kubenswrapper[4789]: I1216 09:15:21.928832 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:15:34 crc kubenswrapper[4789]: I1216 09:15:34.972000 4789 scope.go:117] "RemoveContainer" containerID="2722fb02af00f277d6613ad5320fcf6fcc623e5f628b7203193a965f3f709d7c" Dec 16 09:15:50 crc kubenswrapper[4789]: I1216 09:15:50.314282 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 09:15:50 crc kubenswrapper[4789]: I1216 09:15:50.316202 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" 
podUID="cbc80106-cc75-47d5-beaf-6d9c7c20d41e" containerName="adoption" containerID="cri-o://a68068a499e399865abb8dafc2e1c76db2509b91ee4934d47c506e10c5b07e01" gracePeriod=30 Dec 16 09:15:51 crc kubenswrapper[4789]: I1216 09:15:51.928351 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:15:51 crc kubenswrapper[4789]: I1216 09:15:51.928452 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:15:51 crc kubenswrapper[4789]: I1216 09:15:51.928521 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 09:15:51 crc kubenswrapper[4789]: I1216 09:15:51.929783 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"849ec36ca6ea5e90f98f29a11e0d3e94cc588787a9f4c7ff35d827cbbcbb4e10"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:15:51 crc kubenswrapper[4789]: I1216 09:15:51.929871 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://849ec36ca6ea5e90f98f29a11e0d3e94cc588787a9f4c7ff35d827cbbcbb4e10" gracePeriod=600 Dec 16 09:15:52 crc kubenswrapper[4789]: 
I1216 09:15:52.158004 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="849ec36ca6ea5e90f98f29a11e0d3e94cc588787a9f4c7ff35d827cbbcbb4e10" exitCode=0 Dec 16 09:15:52 crc kubenswrapper[4789]: I1216 09:15:52.158081 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"849ec36ca6ea5e90f98f29a11e0d3e94cc588787a9f4c7ff35d827cbbcbb4e10"} Dec 16 09:15:52 crc kubenswrapper[4789]: I1216 09:15:52.158472 4789 scope.go:117] "RemoveContainer" containerID="d6bc41f72472bdd1a2c84b83af196c20748d38f6a50b24244a0c6a9ed5a00359" Dec 16 09:15:53 crc kubenswrapper[4789]: I1216 09:15:53.170673 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8"} Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.860371 4789 generic.go:334] "Generic (PLEG): container finished" podID="cbc80106-cc75-47d5-beaf-6d9c7c20d41e" containerID="a68068a499e399865abb8dafc2e1c76db2509b91ee4934d47c506e10c5b07e01" exitCode=137 Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.864109 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cbc80106-cc75-47d5-beaf-6d9c7c20d41e","Type":"ContainerDied","Data":"a68068a499e399865abb8dafc2e1c76db2509b91ee4934d47c506e10c5b07e01"} Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.865816 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"cbc80106-cc75-47d5-beaf-6d9c7c20d41e","Type":"ContainerDied","Data":"27e65e5a5c6929d7ba8f9391af75d2c494e2a6aed43189f741611ecfd601880f"} Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.865861 4789 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e65e5a5c6929d7ba8f9391af75d2c494e2a6aed43189f741611ecfd601880f" Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.869765 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.961417 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\") pod \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.961553 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rlsh\" (UniqueName: \"kubernetes.io/projected/cbc80106-cc75-47d5-beaf-6d9c7c20d41e-kube-api-access-9rlsh\") pod \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\" (UID: \"cbc80106-cc75-47d5-beaf-6d9c7c20d41e\") " Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.970767 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc80106-cc75-47d5-beaf-6d9c7c20d41e-kube-api-access-9rlsh" (OuterVolumeSpecName: "kube-api-access-9rlsh") pod "cbc80106-cc75-47d5-beaf-6d9c7c20d41e" (UID: "cbc80106-cc75-47d5-beaf-6d9c7c20d41e"). InnerVolumeSpecName "kube-api-access-9rlsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:16:20 crc kubenswrapper[4789]: I1216 09:16:20.981241 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46" (OuterVolumeSpecName: "mariadb-data") pod "cbc80106-cc75-47d5-beaf-6d9c7c20d41e" (UID: "cbc80106-cc75-47d5-beaf-6d9c7c20d41e"). InnerVolumeSpecName "pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.065217 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\") on node \"crc\" " Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.065259 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rlsh\" (UniqueName: \"kubernetes.io/projected/cbc80106-cc75-47d5-beaf-6d9c7c20d41e-kube-api-access-9rlsh\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.089556 4789 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.089734 4789 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46") on node "crc" Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.169176 4789 reconciler_common.go:293] "Volume detached for volume \"pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f28fac8-c78d-4b8d-bed1-838604c1ed46\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.877117 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.917145 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 09:16:21 crc kubenswrapper[4789]: I1216 09:16:21.928057 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 16 09:16:22 crc kubenswrapper[4789]: I1216 09:16:22.116205 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc80106-cc75-47d5-beaf-6d9c7c20d41e" path="/var/lib/kubelet/pods/cbc80106-cc75-47d5-beaf-6d9c7c20d41e/volumes" Dec 16 09:16:22 crc kubenswrapper[4789]: I1216 09:16:22.662101 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 09:16:22 crc kubenswrapper[4789]: I1216 09:16:22.662319 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="4ca28de1-5df8-477e-9068-55523ad13390" containerName="adoption" containerID="cri-o://a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f" gracePeriod=30 Dec 16 09:16:35 crc kubenswrapper[4789]: I1216 09:16:35.075479 4789 scope.go:117] "RemoveContainer" containerID="a68068a499e399865abb8dafc2e1c76db2509b91ee4934d47c506e10c5b07e01" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.187003 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.209716 4789 generic.go:334] "Generic (PLEG): container finished" podID="4ca28de1-5df8-477e-9068-55523ad13390" containerID="a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f" exitCode=137 Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.209758 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4ca28de1-5df8-477e-9068-55523ad13390","Type":"ContainerDied","Data":"a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f"} Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.209786 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4ca28de1-5df8-477e-9068-55523ad13390","Type":"ContainerDied","Data":"9ed0c063f413002d2196de194566ac6589a547263ea761adf484a8f865a3f4a8"} Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.209777 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.209804 4789 scope.go:117] "RemoveContainer" containerID="a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.286102 4789 scope.go:117] "RemoveContainer" containerID="a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f" Dec 16 09:16:53 crc kubenswrapper[4789]: E1216 09:16:53.288344 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f\": container with ID starting with a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f not found: ID does not exist" containerID="a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.288401 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f"} err="failed to get container status \"a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f\": rpc error: code = NotFound desc = could not find container \"a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f\": container with ID starting with a66b608bb0f25a1a2bd250a010a3accc48b9d7693ac328e91139817fd8752e2f not found: ID does not exist" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.292522 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4ca28de1-5df8-477e-9068-55523ad13390-ovn-data-cert\") pod \"4ca28de1-5df8-477e-9068-55523ad13390\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.293854 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\") pod \"4ca28de1-5df8-477e-9068-55523ad13390\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.295073 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7jwj\" (UniqueName: \"kubernetes.io/projected/4ca28de1-5df8-477e-9068-55523ad13390-kube-api-access-b7jwj\") pod \"4ca28de1-5df8-477e-9068-55523ad13390\" (UID: \"4ca28de1-5df8-477e-9068-55523ad13390\") " Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.310320 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca28de1-5df8-477e-9068-55523ad13390-kube-api-access-b7jwj" (OuterVolumeSpecName: "kube-api-access-b7jwj") pod "4ca28de1-5df8-477e-9068-55523ad13390" (UID: "4ca28de1-5df8-477e-9068-55523ad13390"). InnerVolumeSpecName "kube-api-access-b7jwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.310439 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ca28de1-5df8-477e-9068-55523ad13390-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "4ca28de1-5df8-477e-9068-55523ad13390" (UID: "4ca28de1-5df8-477e-9068-55523ad13390"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.315730 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c" (OuterVolumeSpecName: "ovn-data") pod "4ca28de1-5df8-477e-9068-55523ad13390" (UID: "4ca28de1-5df8-477e-9068-55523ad13390"). InnerVolumeSpecName "pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.401672 4789 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4ca28de1-5df8-477e-9068-55523ad13390-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.401729 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\") on node \"crc\" " Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.401742 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7jwj\" (UniqueName: \"kubernetes.io/projected/4ca28de1-5df8-477e-9068-55523ad13390-kube-api-access-b7jwj\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.426831 4789 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.427007 4789 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c") on node "crc" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.503795 4789 reconciler_common.go:293] "Volume detached for volume \"pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b468a0cd-e4bb-4e48-80e8-26236729b31c\") on node \"crc\" DevicePath \"\"" Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.564538 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 09:16:53 crc kubenswrapper[4789]: I1216 09:16:53.572868 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 16 09:16:54 crc kubenswrapper[4789]: I1216 09:16:54.115675 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca28de1-5df8-477e-9068-55523ad13390" path="/var/lib/kubelet/pods/4ca28de1-5df8-477e-9068-55523ad13390/volumes" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.758670 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 09:17:13 crc kubenswrapper[4789]: E1216 09:17:13.760364 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca28de1-5df8-477e-9068-55523ad13390" containerName="adoption" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.760385 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca28de1-5df8-477e-9068-55523ad13390" containerName="adoption" Dec 16 09:17:13 crc kubenswrapper[4789]: E1216 09:17:13.760407 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc80106-cc75-47d5-beaf-6d9c7c20d41e" containerName="adoption" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.760414 4789 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cbc80106-cc75-47d5-beaf-6d9c7c20d41e" containerName="adoption" Dec 16 09:17:13 crc kubenswrapper[4789]: E1216 09:17:13.760439 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4" containerName="collect-profiles" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.760449 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4" containerName="collect-profiles" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.760677 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca28de1-5df8-477e-9068-55523ad13390" containerName="adoption" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.760690 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc80106-cc75-47d5-beaf-6d9c7c20d41e" containerName="adoption" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.760710 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f5bd28-a4f3-4b58-ae3f-1325f36dc0c4" containerName="collect-profiles" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.761793 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.764439 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.764933 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.765163 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lsf8v" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.777650 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.777737 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.831585 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.831649 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.831797 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: 
\"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.832207 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-config-data\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.832270 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk4pp\" (UniqueName: \"kubernetes.io/projected/ea4afd4b-996e-4079-83b9-f2c3e8242de1-kube-api-access-bk4pp\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.832418 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.832520 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.832631 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-temporary\") 
pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.832862 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.935390 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.936031 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-config-data\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.936169 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk4pp\" (UniqueName: \"kubernetes.io/projected/ea4afd4b-996e-4079-83b9-f2c3e8242de1-kube-api-access-bk4pp\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.936329 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 
16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.936452 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.936477 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.936737 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.937092 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.937298 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " 
pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.937643 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.937831 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.937943 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-config-data\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.938083 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.938985 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.944351 4789 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.944998 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.945487 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.962801 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk4pp\" (UniqueName: \"kubernetes.io/projected/ea4afd4b-996e-4079-83b9-f2c3e8242de1-kube-api-access-bk4pp\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:13 crc kubenswrapper[4789]: I1216 09:17:13.975524 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " pod="openstack/tempest-tests-tempest" Dec 16 09:17:14 crc kubenswrapper[4789]: I1216 09:17:14.084950 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 09:17:14 crc kubenswrapper[4789]: I1216 09:17:14.595827 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 09:17:14 crc kubenswrapper[4789]: I1216 09:17:14.774695 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:17:15 crc kubenswrapper[4789]: I1216 09:17:15.415675 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ea4afd4b-996e-4079-83b9-f2c3e8242de1","Type":"ContainerStarted","Data":"9dd5e1d3eace1ed8535479f65c99575666105fc731ef0afa94e7bd175033564b"} Dec 16 09:18:05 crc kubenswrapper[4789]: E1216 09:18:05.872479 4789 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 09:18:05 crc kubenswrapper[4789]: E1216 09:18:05.873058 4789 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3a837a7c939c44c9106d2b2c7c72015" Dec 16 09:18:05 crc kubenswrapper[4789]: E1216 09:18:05.873208 4789 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3a837a7c939c44c9106d2b2c7c72015,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bk4pp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ea4afd4b-996e-4079-83b9-f2c3e8242de1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 09:18:05 crc kubenswrapper[4789]: E1216 09:18:05.875217 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ea4afd4b-996e-4079-83b9-f2c3e8242de1" Dec 16 09:18:06 crc kubenswrapper[4789]: E1216 09:18:06.086132 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c3a837a7c939c44c9106d2b2c7c72015\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="ea4afd4b-996e-4079-83b9-f2c3e8242de1" Dec 16 09:18:18 crc kubenswrapper[4789]: I1216 09:18:18.330229 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 16 09:18:20 crc kubenswrapper[4789]: I1216 09:18:20.204763 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ea4afd4b-996e-4079-83b9-f2c3e8242de1","Type":"ContainerStarted","Data":"d4ae0350b5e034c789ccc896e628f89eca6511fadd8f92932d7d7c5141a16a29"} Dec 16 09:18:20 crc kubenswrapper[4789]: I1216 09:18:20.234480 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.681624039 podStartE2EDuration="1m8.234457551s" podCreationTimestamp="2025-12-16 09:17:12 +0000 UTC" firstStartedPulling="2025-12-16 09:17:14.774430294 +0000 UTC m=+8773.036317923" lastFinishedPulling="2025-12-16 09:18:18.327263806 +0000 UTC m=+8836.589151435" observedRunningTime="2025-12-16 09:18:20.224699294 +0000 UTC m=+8838.486586923" watchObservedRunningTime="2025-12-16 09:18:20.234457551 +0000 UTC m=+8838.496345180" Dec 16 09:18:21 crc kubenswrapper[4789]: I1216 09:18:21.927601 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:18:21 crc kubenswrapper[4789]: I1216 09:18:21.928200 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:18:51 crc kubenswrapper[4789]: I1216 09:18:51.927669 4789 patch_prober.go:28] 
interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:18:51 crc kubenswrapper[4789]: I1216 09:18:51.929197 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.686961 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-29rcd"] Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.690213 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.699059 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29rcd"] Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.750323 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-kube-api-access-sl6fx\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.750787 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-catalog-content\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " 
pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.751341 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-utilities\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.853805 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-kube-api-access-sl6fx\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.854163 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-catalog-content\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.854274 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-utilities\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.854767 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-catalog-content\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " 
pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.854804 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-utilities\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:17 crc kubenswrapper[4789]: I1216 09:19:17.879154 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-kube-api-access-sl6fx\") pod \"redhat-marketplace-29rcd\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:18 crc kubenswrapper[4789]: I1216 09:19:18.034849 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:18 crc kubenswrapper[4789]: I1216 09:19:18.573051 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29rcd"] Dec 16 09:19:19 crc kubenswrapper[4789]: I1216 09:19:19.775197 4789 generic.go:334] "Generic (PLEG): container finished" podID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerID="e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1" exitCode=0 Dec 16 09:19:19 crc kubenswrapper[4789]: I1216 09:19:19.775248 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29rcd" event={"ID":"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c","Type":"ContainerDied","Data":"e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1"} Dec 16 09:19:19 crc kubenswrapper[4789]: I1216 09:19:19.775646 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29rcd" 
event={"ID":"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c","Type":"ContainerStarted","Data":"a8a9d1f4c5ac3d4469793f60765bc6b52cac13edf8816f232564ef0f36ada585"} Dec 16 09:19:21 crc kubenswrapper[4789]: I1216 09:19:21.928035 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:19:21 crc kubenswrapper[4789]: I1216 09:19:21.928617 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:19:21 crc kubenswrapper[4789]: I1216 09:19:21.928671 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 09:19:21 crc kubenswrapper[4789]: I1216 09:19:21.929557 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:19:21 crc kubenswrapper[4789]: I1216 09:19:21.929606 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" gracePeriod=600 Dec 16 09:19:22 crc kubenswrapper[4789]: E1216 09:19:22.117887 
4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:19:22 crc kubenswrapper[4789]: I1216 09:19:22.802070 4789 generic.go:334] "Generic (PLEG): container finished" podID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerID="530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8" exitCode=0 Dec 16 09:19:22 crc kubenswrapper[4789]: I1216 09:19:22.802119 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29rcd" event={"ID":"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c","Type":"ContainerDied","Data":"530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8"} Dec 16 09:19:22 crc kubenswrapper[4789]: I1216 09:19:22.807568 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" exitCode=0 Dec 16 09:19:22 crc kubenswrapper[4789]: I1216 09:19:22.807612 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8"} Dec 16 09:19:22 crc kubenswrapper[4789]: I1216 09:19:22.807648 4789 scope.go:117] "RemoveContainer" containerID="849ec36ca6ea5e90f98f29a11e0d3e94cc588787a9f4c7ff35d827cbbcbb4e10" Dec 16 09:19:22 crc kubenswrapper[4789]: I1216 09:19:22.808403 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:19:22 crc kubenswrapper[4789]: 
E1216 09:19:22.808702 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:19:23 crc kubenswrapper[4789]: I1216 09:19:23.822160 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29rcd" event={"ID":"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c","Type":"ContainerStarted","Data":"07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e"} Dec 16 09:19:23 crc kubenswrapper[4789]: I1216 09:19:23.845517 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-29rcd" podStartSLOduration=3.37244728 podStartE2EDuration="6.845492433s" podCreationTimestamp="2025-12-16 09:19:17 +0000 UTC" firstStartedPulling="2025-12-16 09:19:19.777977479 +0000 UTC m=+8898.039865118" lastFinishedPulling="2025-12-16 09:19:23.251022642 +0000 UTC m=+8901.512910271" observedRunningTime="2025-12-16 09:19:23.838877922 +0000 UTC m=+8902.100765551" watchObservedRunningTime="2025-12-16 09:19:23.845492433 +0000 UTC m=+8902.107380062" Dec 16 09:19:28 crc kubenswrapper[4789]: I1216 09:19:28.035243 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:28 crc kubenswrapper[4789]: I1216 09:19:28.036820 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:28 crc kubenswrapper[4789]: I1216 09:19:28.084374 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:29 crc 
kubenswrapper[4789]: I1216 09:19:29.109575 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:29 crc kubenswrapper[4789]: I1216 09:19:29.166195 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29rcd"] Dec 16 09:19:30 crc kubenswrapper[4789]: I1216 09:19:30.882985 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-29rcd" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="registry-server" containerID="cri-o://07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e" gracePeriod=2 Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.512279 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.567262 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-catalog-content\") pod \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.567532 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-utilities\") pod \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.567606 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-kube-api-access-sl6fx\") pod \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\" (UID: \"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c\") " Dec 16 09:19:31 crc 
kubenswrapper[4789]: I1216 09:19:31.568424 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-utilities" (OuterVolumeSpecName: "utilities") pod "a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" (UID: "a09d40cd-6723-4e00-9c6e-2ccf3e709a7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.584228 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-kube-api-access-sl6fx" (OuterVolumeSpecName: "kube-api-access-sl6fx") pod "a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" (UID: "a09d40cd-6723-4e00-9c6e-2ccf3e709a7c"). InnerVolumeSpecName "kube-api-access-sl6fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.593521 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" (UID: "a09d40cd-6723-4e00-9c6e-2ccf3e709a7c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.670828 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.670882 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.670895 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c-kube-api-access-sl6fx\") on node \"crc\" DevicePath \"\"" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.897463 4789 generic.go:334] "Generic (PLEG): container finished" podID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerID="07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e" exitCode=0 Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.897504 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29rcd" event={"ID":"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c","Type":"ContainerDied","Data":"07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e"} Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.897532 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29rcd" event={"ID":"a09d40cd-6723-4e00-9c6e-2ccf3e709a7c","Type":"ContainerDied","Data":"a8a9d1f4c5ac3d4469793f60765bc6b52cac13edf8816f232564ef0f36ada585"} Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.897553 4789 scope.go:117] "RemoveContainer" containerID="07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 
09:19:31.897705 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29rcd" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.923416 4789 scope.go:117] "RemoveContainer" containerID="530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.948301 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29rcd"] Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.970993 4789 scope.go:117] "RemoveContainer" containerID="e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1" Dec 16 09:19:31 crc kubenswrapper[4789]: I1216 09:19:31.982374 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-29rcd"] Dec 16 09:19:32 crc kubenswrapper[4789]: I1216 09:19:32.027068 4789 scope.go:117] "RemoveContainer" containerID="07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e" Dec 16 09:19:32 crc kubenswrapper[4789]: E1216 09:19:32.027602 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e\": container with ID starting with 07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e not found: ID does not exist" containerID="07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e" Dec 16 09:19:32 crc kubenswrapper[4789]: I1216 09:19:32.027727 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e"} err="failed to get container status \"07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e\": rpc error: code = NotFound desc = could not find container \"07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e\": container with ID starting with 
07d9aef57dd688ce7d2c39a1d64cdebf3e4cffe76dbc1728e445465d29d8572e not found: ID does not exist" Dec 16 09:19:32 crc kubenswrapper[4789]: I1216 09:19:32.027830 4789 scope.go:117] "RemoveContainer" containerID="530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8" Dec 16 09:19:32 crc kubenswrapper[4789]: E1216 09:19:32.028159 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8\": container with ID starting with 530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8 not found: ID does not exist" containerID="530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8" Dec 16 09:19:32 crc kubenswrapper[4789]: I1216 09:19:32.028257 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8"} err="failed to get container status \"530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8\": rpc error: code = NotFound desc = could not find container \"530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8\": container with ID starting with 530205a8002977bf76b590fd5ac6b65b4ae1e02db6f42a9861e828736f474fd8 not found: ID does not exist" Dec 16 09:19:32 crc kubenswrapper[4789]: I1216 09:19:32.028334 4789 scope.go:117] "RemoveContainer" containerID="e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1" Dec 16 09:19:32 crc kubenswrapper[4789]: E1216 09:19:32.028628 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1\": container with ID starting with e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1 not found: ID does not exist" containerID="e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1" Dec 16 09:19:32 crc 
kubenswrapper[4789]: I1216 09:19:32.028754 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1"} err="failed to get container status \"e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1\": rpc error: code = NotFound desc = could not find container \"e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1\": container with ID starting with e9ce845172e1070e1239694618fdd4c5a402971d7531cca0f0c9329f06373fe1 not found: ID does not exist" Dec 16 09:19:32 crc kubenswrapper[4789]: I1216 09:19:32.120308 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" path="/var/lib/kubelet/pods/a09d40cd-6723-4e00-9c6e-2ccf3e709a7c/volumes" Dec 16 09:19:34 crc kubenswrapper[4789]: I1216 09:19:34.106998 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:19:34 crc kubenswrapper[4789]: E1216 09:19:34.108850 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:19:46 crc kubenswrapper[4789]: I1216 09:19:46.105967 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:19:46 crc kubenswrapper[4789]: E1216 09:19:46.106783 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:19:59 crc kubenswrapper[4789]: I1216 09:19:59.104938 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:19:59 crc kubenswrapper[4789]: E1216 09:19:59.107016 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:20:13 crc kubenswrapper[4789]: I1216 09:20:13.105130 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:20:13 crc kubenswrapper[4789]: E1216 09:20:13.105805 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:20:25 crc kubenswrapper[4789]: I1216 09:20:25.105211 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:20:25 crc kubenswrapper[4789]: E1216 09:20:25.106127 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:20:37 crc kubenswrapper[4789]: I1216 09:20:37.105723 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:20:37 crc kubenswrapper[4789]: E1216 09:20:37.106711 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:20:50 crc kubenswrapper[4789]: I1216 09:20:50.110470 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:20:50 crc kubenswrapper[4789]: E1216 09:20:50.111339 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:21:02 crc kubenswrapper[4789]: I1216 09:21:02.118436 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:21:02 crc kubenswrapper[4789]: E1216 09:21:02.119261 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:21:14 crc kubenswrapper[4789]: I1216 09:21:14.104797 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:21:14 crc kubenswrapper[4789]: E1216 09:21:14.105701 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:21:28 crc kubenswrapper[4789]: I1216 09:21:28.105259 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:21:28 crc kubenswrapper[4789]: E1216 09:21:28.106069 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:21:43 crc kubenswrapper[4789]: I1216 09:21:43.105405 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:21:43 crc kubenswrapper[4789]: E1216 09:21:43.106311 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:21:55 crc kubenswrapper[4789]: I1216 09:21:55.107069 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:21:55 crc kubenswrapper[4789]: E1216 09:21:55.107604 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.803031 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzwfg"] Dec 16 09:22:06 crc kubenswrapper[4789]: E1216 09:22:06.804109 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="extract-content" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.804125 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="extract-content" Dec 16 09:22:06 crc kubenswrapper[4789]: E1216 09:22:06.804156 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="extract-utilities" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.804164 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="extract-utilities" Dec 16 09:22:06 crc kubenswrapper[4789]: E1216 09:22:06.804191 4789 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="registry-server" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.804199 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="registry-server" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.804440 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09d40cd-6723-4e00-9c6e-2ccf3e709a7c" containerName="registry-server" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.806511 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.835732 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzwfg"] Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.952837 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-catalog-content\") pod \"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.952908 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrrdp\" (UniqueName: \"kubernetes.io/projected/39b2655c-2c5e-4906-81f2-c884f99c457a-kube-api-access-hrrdp\") pod \"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:06 crc kubenswrapper[4789]: I1216 09:22:06.953258 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-utilities\") pod 
\"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.055030 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-catalog-content\") pod \"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.055161 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrrdp\" (UniqueName: \"kubernetes.io/projected/39b2655c-2c5e-4906-81f2-c884f99c457a-kube-api-access-hrrdp\") pod \"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.055276 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-utilities\") pod \"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.055957 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-utilities\") pod \"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.056220 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-catalog-content\") pod \"redhat-operators-dzwfg\" (UID: 
\"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.082499 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrrdp\" (UniqueName: \"kubernetes.io/projected/39b2655c-2c5e-4906-81f2-c884f99c457a-kube-api-access-hrrdp\") pod \"redhat-operators-dzwfg\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.131323 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:07 crc kubenswrapper[4789]: I1216 09:22:07.591496 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzwfg"] Dec 16 09:22:08 crc kubenswrapper[4789]: I1216 09:22:08.105478 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:22:08 crc kubenswrapper[4789]: E1216 09:22:08.105707 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:22:08 crc kubenswrapper[4789]: I1216 09:22:08.399151 4789 generic.go:334] "Generic (PLEG): container finished" podID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerID="e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad" exitCode=0 Dec 16 09:22:08 crc kubenswrapper[4789]: I1216 09:22:08.399262 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzwfg" 
event={"ID":"39b2655c-2c5e-4906-81f2-c884f99c457a","Type":"ContainerDied","Data":"e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad"} Dec 16 09:22:08 crc kubenswrapper[4789]: I1216 09:22:08.399465 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzwfg" event={"ID":"39b2655c-2c5e-4906-81f2-c884f99c457a","Type":"ContainerStarted","Data":"7b32502eafd200f2462ba516ba9ddf1ae026f43606cacb95541a47d2a7f03efc"} Dec 16 09:22:10 crc kubenswrapper[4789]: I1216 09:22:10.419467 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzwfg" event={"ID":"39b2655c-2c5e-4906-81f2-c884f99c457a","Type":"ContainerStarted","Data":"d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158"} Dec 16 09:22:13 crc kubenswrapper[4789]: I1216 09:22:13.461274 4789 generic.go:334] "Generic (PLEG): container finished" podID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerID="d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158" exitCode=0 Dec 16 09:22:13 crc kubenswrapper[4789]: I1216 09:22:13.461375 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzwfg" event={"ID":"39b2655c-2c5e-4906-81f2-c884f99c457a","Type":"ContainerDied","Data":"d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158"} Dec 16 09:22:15 crc kubenswrapper[4789]: I1216 09:22:15.482196 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzwfg" event={"ID":"39b2655c-2c5e-4906-81f2-c884f99c457a","Type":"ContainerStarted","Data":"ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e"} Dec 16 09:22:15 crc kubenswrapper[4789]: I1216 09:22:15.512244 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzwfg" podStartSLOduration=3.922478872 podStartE2EDuration="9.512221351s" podCreationTimestamp="2025-12-16 09:22:06 +0000 UTC" 
firstStartedPulling="2025-12-16 09:22:08.401185422 +0000 UTC m=+9066.663073051" lastFinishedPulling="2025-12-16 09:22:13.990927901 +0000 UTC m=+9072.252815530" observedRunningTime="2025-12-16 09:22:15.500530047 +0000 UTC m=+9073.762417676" watchObservedRunningTime="2025-12-16 09:22:15.512221351 +0000 UTC m=+9073.774108980" Dec 16 09:22:17 crc kubenswrapper[4789]: I1216 09:22:17.132196 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:17 crc kubenswrapper[4789]: I1216 09:22:17.133333 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:18 crc kubenswrapper[4789]: I1216 09:22:18.191990 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzwfg" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="registry-server" probeResult="failure" output=< Dec 16 09:22:18 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 09:22:18 crc kubenswrapper[4789]: > Dec 16 09:22:19 crc kubenswrapper[4789]: I1216 09:22:19.105899 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:22:19 crc kubenswrapper[4789]: E1216 09:22:19.106558 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:22:27 crc kubenswrapper[4789]: I1216 09:22:27.437830 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:27 crc 
kubenswrapper[4789]: I1216 09:22:27.486666 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:27 crc kubenswrapper[4789]: I1216 09:22:27.683667 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzwfg"] Dec 16 09:22:28 crc kubenswrapper[4789]: I1216 09:22:28.601413 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzwfg" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="registry-server" containerID="cri-o://ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e" gracePeriod=2 Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.290548 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.334592 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-utilities\") pod \"39b2655c-2c5e-4906-81f2-c884f99c457a\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.334648 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-catalog-content\") pod \"39b2655c-2c5e-4906-81f2-c884f99c457a\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.334899 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrrdp\" (UniqueName: \"kubernetes.io/projected/39b2655c-2c5e-4906-81f2-c884f99c457a-kube-api-access-hrrdp\") pod \"39b2655c-2c5e-4906-81f2-c884f99c457a\" (UID: \"39b2655c-2c5e-4906-81f2-c884f99c457a\") " Dec 16 09:22:29 crc 
kubenswrapper[4789]: I1216 09:22:29.335797 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-utilities" (OuterVolumeSpecName: "utilities") pod "39b2655c-2c5e-4906-81f2-c884f99c457a" (UID: "39b2655c-2c5e-4906-81f2-c884f99c457a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.349082 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b2655c-2c5e-4906-81f2-c884f99c457a-kube-api-access-hrrdp" (OuterVolumeSpecName: "kube-api-access-hrrdp") pod "39b2655c-2c5e-4906-81f2-c884f99c457a" (UID: "39b2655c-2c5e-4906-81f2-c884f99c457a"). InnerVolumeSpecName "kube-api-access-hrrdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.437633 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrrdp\" (UniqueName: \"kubernetes.io/projected/39b2655c-2c5e-4906-81f2-c884f99c457a-kube-api-access-hrrdp\") on node \"crc\" DevicePath \"\"" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.437676 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.451818 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39b2655c-2c5e-4906-81f2-c884f99c457a" (UID: "39b2655c-2c5e-4906-81f2-c884f99c457a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.540972 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39b2655c-2c5e-4906-81f2-c884f99c457a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.612746 4789 generic.go:334] "Generic (PLEG): container finished" podID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerID="ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e" exitCode=0 Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.612798 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzwfg" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.612816 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzwfg" event={"ID":"39b2655c-2c5e-4906-81f2-c884f99c457a","Type":"ContainerDied","Data":"ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e"} Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.612861 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzwfg" event={"ID":"39b2655c-2c5e-4906-81f2-c884f99c457a","Type":"ContainerDied","Data":"7b32502eafd200f2462ba516ba9ddf1ae026f43606cacb95541a47d2a7f03efc"} Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.612879 4789 scope.go:117] "RemoveContainer" containerID="ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.632516 4789 scope.go:117] "RemoveContainer" containerID="d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.645527 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzwfg"] Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 
09:22:29.654439 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzwfg"] Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.663697 4789 scope.go:117] "RemoveContainer" containerID="e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.703795 4789 scope.go:117] "RemoveContainer" containerID="ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e" Dec 16 09:22:29 crc kubenswrapper[4789]: E1216 09:22:29.704375 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e\": container with ID starting with ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e not found: ID does not exist" containerID="ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.704437 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e"} err="failed to get container status \"ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e\": rpc error: code = NotFound desc = could not find container \"ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e\": container with ID starting with ce25535a20c6aeae1bf4bf57fc4f54118fd75d071a821591b86388d2e8a78d5e not found: ID does not exist" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.704472 4789 scope.go:117] "RemoveContainer" containerID="d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158" Dec 16 09:22:29 crc kubenswrapper[4789]: E1216 09:22:29.704875 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158\": container with ID 
starting with d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158 not found: ID does not exist" containerID="d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.704905 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158"} err="failed to get container status \"d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158\": rpc error: code = NotFound desc = could not find container \"d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158\": container with ID starting with d3eb41fb59b5509694f4fedfca6ea9a80e839a4a23b9aa402390e79839fbd158 not found: ID does not exist" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.704940 4789 scope.go:117] "RemoveContainer" containerID="e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad" Dec 16 09:22:29 crc kubenswrapper[4789]: E1216 09:22:29.705236 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad\": container with ID starting with e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad not found: ID does not exist" containerID="e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad" Dec 16 09:22:29 crc kubenswrapper[4789]: I1216 09:22:29.705273 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad"} err="failed to get container status \"e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad\": rpc error: code = NotFound desc = could not find container \"e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad\": container with ID starting with e34f33ce40db6932736db87fe3585905c43ab4de2c9b5f410d3b7ebdebcca4ad not found: 
ID does not exist" Dec 16 09:22:30 crc kubenswrapper[4789]: I1216 09:22:30.119069 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" path="/var/lib/kubelet/pods/39b2655c-2c5e-4906-81f2-c884f99c457a/volumes" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.086276 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zqmrw"] Dec 16 09:22:32 crc kubenswrapper[4789]: E1216 09:22:32.088570 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="extract-content" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.088599 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="extract-content" Dec 16 09:22:32 crc kubenswrapper[4789]: E1216 09:22:32.088617 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="extract-utilities" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.088624 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="extract-utilities" Dec 16 09:22:32 crc kubenswrapper[4789]: E1216 09:22:32.088656 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="registry-server" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.088662 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="registry-server" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.088853 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b2655c-2c5e-4906-81f2-c884f99c457a" containerName="registry-server" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.091044 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.108450 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:22:32 crc kubenswrapper[4789]: E1216 09:22:32.108660 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.137508 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zqmrw"] Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.194187 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-catalog-content\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.194267 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-utilities\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.194305 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v7ct\" (UniqueName: 
\"kubernetes.io/projected/c745c97d-a1cd-42b6-ab36-36827fe73bd5-kube-api-access-5v7ct\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.296032 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v7ct\" (UniqueName: \"kubernetes.io/projected/c745c97d-a1cd-42b6-ab36-36827fe73bd5-kube-api-access-5v7ct\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.296515 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-catalog-content\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.296642 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-utilities\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.297108 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-utilities\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.297397 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-catalog-content\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.318315 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v7ct\" (UniqueName: \"kubernetes.io/projected/c745c97d-a1cd-42b6-ab36-36827fe73bd5-kube-api-access-5v7ct\") pod \"certified-operators-zqmrw\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:32 crc kubenswrapper[4789]: I1216 09:22:32.418131 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:33 crc kubenswrapper[4789]: I1216 09:22:33.014982 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zqmrw"] Dec 16 09:22:33 crc kubenswrapper[4789]: I1216 09:22:33.663058 4789 generic.go:334] "Generic (PLEG): container finished" podID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerID="190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d" exitCode=0 Dec 16 09:22:33 crc kubenswrapper[4789]: I1216 09:22:33.663159 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqmrw" event={"ID":"c745c97d-a1cd-42b6-ab36-36827fe73bd5","Type":"ContainerDied","Data":"190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d"} Dec 16 09:22:33 crc kubenswrapper[4789]: I1216 09:22:33.663402 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqmrw" event={"ID":"c745c97d-a1cd-42b6-ab36-36827fe73bd5","Type":"ContainerStarted","Data":"aebf8af2a001e50009970dd1678c05e3f72d9495eae0d613f0a3890210451762"} Dec 16 09:22:33 crc kubenswrapper[4789]: I1216 09:22:33.666006 4789 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:22:35 crc kubenswrapper[4789]: I1216 09:22:35.686692 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqmrw" event={"ID":"c745c97d-a1cd-42b6-ab36-36827fe73bd5","Type":"ContainerStarted","Data":"8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035"} Dec 16 09:22:36 crc kubenswrapper[4789]: I1216 09:22:36.698645 4789 generic.go:334] "Generic (PLEG): container finished" podID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerID="8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035" exitCode=0 Dec 16 09:22:36 crc kubenswrapper[4789]: I1216 09:22:36.698704 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqmrw" event={"ID":"c745c97d-a1cd-42b6-ab36-36827fe73bd5","Type":"ContainerDied","Data":"8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035"} Dec 16 09:22:38 crc kubenswrapper[4789]: I1216 09:22:38.720662 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqmrw" event={"ID":"c745c97d-a1cd-42b6-ab36-36827fe73bd5","Type":"ContainerStarted","Data":"84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c"} Dec 16 09:22:38 crc kubenswrapper[4789]: I1216 09:22:38.740371 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zqmrw" podStartSLOduration=2.896435142 podStartE2EDuration="6.740355573s" podCreationTimestamp="2025-12-16 09:22:32 +0000 UTC" firstStartedPulling="2025-12-16 09:22:33.665707039 +0000 UTC m=+9091.927594668" lastFinishedPulling="2025-12-16 09:22:37.50962747 +0000 UTC m=+9095.771515099" observedRunningTime="2025-12-16 09:22:38.739349298 +0000 UTC m=+9097.001236937" watchObservedRunningTime="2025-12-16 09:22:38.740355573 +0000 UTC m=+9097.002243202" Dec 16 09:22:42 crc kubenswrapper[4789]: I1216 09:22:42.419173 4789 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:42 crc kubenswrapper[4789]: I1216 09:22:42.419730 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:42 crc kubenswrapper[4789]: I1216 09:22:42.471820 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:42 crc kubenswrapper[4789]: I1216 09:22:42.810843 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:42 crc kubenswrapper[4789]: I1216 09:22:42.863835 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zqmrw"] Dec 16 09:22:44 crc kubenswrapper[4789]: I1216 09:22:44.104607 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:22:44 crc kubenswrapper[4789]: E1216 09:22:44.106079 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:22:44 crc kubenswrapper[4789]: I1216 09:22:44.778736 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zqmrw" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="registry-server" containerID="cri-o://84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c" gracePeriod=2 Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.447720 4789 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.488714 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-utilities\") pod \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.488795 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-catalog-content\") pod \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.489010 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v7ct\" (UniqueName: \"kubernetes.io/projected/c745c97d-a1cd-42b6-ab36-36827fe73bd5-kube-api-access-5v7ct\") pod \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\" (UID: \"c745c97d-a1cd-42b6-ab36-36827fe73bd5\") " Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.489945 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-utilities" (OuterVolumeSpecName: "utilities") pod "c745c97d-a1cd-42b6-ab36-36827fe73bd5" (UID: "c745c97d-a1cd-42b6-ab36-36827fe73bd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.496175 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c745c97d-a1cd-42b6-ab36-36827fe73bd5-kube-api-access-5v7ct" (OuterVolumeSpecName: "kube-api-access-5v7ct") pod "c745c97d-a1cd-42b6-ab36-36827fe73bd5" (UID: "c745c97d-a1cd-42b6-ab36-36827fe73bd5"). 
InnerVolumeSpecName "kube-api-access-5v7ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.600756 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.600786 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v7ct\" (UniqueName: \"kubernetes.io/projected/c745c97d-a1cd-42b6-ab36-36827fe73bd5-kube-api-access-5v7ct\") on node \"crc\" DevicePath \"\"" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.697539 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c745c97d-a1cd-42b6-ab36-36827fe73bd5" (UID: "c745c97d-a1cd-42b6-ab36-36827fe73bd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.703024 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c745c97d-a1cd-42b6-ab36-36827fe73bd5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.792667 4789 generic.go:334] "Generic (PLEG): container finished" podID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerID="84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c" exitCode=0 Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.793230 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zqmrw" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.793249 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqmrw" event={"ID":"c745c97d-a1cd-42b6-ab36-36827fe73bd5","Type":"ContainerDied","Data":"84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c"} Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.794030 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zqmrw" event={"ID":"c745c97d-a1cd-42b6-ab36-36827fe73bd5","Type":"ContainerDied","Data":"aebf8af2a001e50009970dd1678c05e3f72d9495eae0d613f0a3890210451762"} Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.794093 4789 scope.go:117] "RemoveContainer" containerID="84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.818897 4789 scope.go:117] "RemoveContainer" containerID="8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.839072 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zqmrw"] Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.856118 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zqmrw"] Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.869535 4789 scope.go:117] "RemoveContainer" containerID="190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.892059 4789 scope.go:117] "RemoveContainer" containerID="84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c" Dec 16 09:22:45 crc kubenswrapper[4789]: E1216 09:22:45.893295 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c\": container with ID starting with 84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c not found: ID does not exist" containerID="84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.893409 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c"} err="failed to get container status \"84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c\": rpc error: code = NotFound desc = could not find container \"84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c\": container with ID starting with 84e6d4a5279118e4c122ed28682fc9ca364d9f8dd034cac3d03b1bc4c5cbe95c not found: ID does not exist" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.893506 4789 scope.go:117] "RemoveContainer" containerID="8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035" Dec 16 09:22:45 crc kubenswrapper[4789]: E1216 09:22:45.893930 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035\": container with ID starting with 8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035 not found: ID does not exist" containerID="8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.894018 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035"} err="failed to get container status \"8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035\": rpc error: code = NotFound desc = could not find container \"8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035\": container with ID 
starting with 8eee6204f6784a83538526a5eb73fa7bcc6bf397b5949bb5b1ac265c84340035 not found: ID does not exist" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.894096 4789 scope.go:117] "RemoveContainer" containerID="190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d" Dec 16 09:22:45 crc kubenswrapper[4789]: E1216 09:22:45.894447 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d\": container with ID starting with 190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d not found: ID does not exist" containerID="190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d" Dec 16 09:22:45 crc kubenswrapper[4789]: I1216 09:22:45.894476 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d"} err="failed to get container status \"190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d\": rpc error: code = NotFound desc = could not find container \"190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d\": container with ID starting with 190bd66506a4a124062cf0e84db121615d15f99c674fd6ddded7635cbc41ca4d not found: ID does not exist" Dec 16 09:22:46 crc kubenswrapper[4789]: I1216 09:22:46.119135 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" path="/var/lib/kubelet/pods/c745c97d-a1cd-42b6-ab36-36827fe73bd5/volumes" Dec 16 09:22:57 crc kubenswrapper[4789]: I1216 09:22:57.106125 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:22:57 crc kubenswrapper[4789]: E1216 09:22:57.107285 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:23:08 crc kubenswrapper[4789]: I1216 09:23:08.105471 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:23:08 crc kubenswrapper[4789]: E1216 09:23:08.106232 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:23:23 crc kubenswrapper[4789]: I1216 09:23:23.104863 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:23:23 crc kubenswrapper[4789]: E1216 09:23:23.105446 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:23:36 crc kubenswrapper[4789]: I1216 09:23:36.105500 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:23:36 crc kubenswrapper[4789]: E1216 09:23:36.106467 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:23:47 crc kubenswrapper[4789]: I1216 09:23:47.106862 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:23:47 crc kubenswrapper[4789]: E1216 09:23:47.107729 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:24:01 crc kubenswrapper[4789]: I1216 09:24:01.105287 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:24:01 crc kubenswrapper[4789]: E1216 09:24:01.106076 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:24:13 crc kubenswrapper[4789]: I1216 09:24:13.104762 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:24:13 crc kubenswrapper[4789]: E1216 09:24:13.105785 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:24:27 crc kubenswrapper[4789]: I1216 09:24:27.105134 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8" Dec 16 09:24:27 crc kubenswrapper[4789]: I1216 09:24:27.806733 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"64dd59044e7fc6d9f319f7b4fd3998634b32358794abdb51b6987acb14b1ba40"} Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.144966 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wjnfr"] Dec 16 09:24:36 crc kubenswrapper[4789]: E1216 09:24:36.150890 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="extract-utilities" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.150943 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="extract-utilities" Dec 16 09:24:36 crc kubenswrapper[4789]: E1216 09:24:36.150967 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="registry-server" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.150976 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="registry-server" Dec 16 09:24:36 crc kubenswrapper[4789]: E1216 09:24:36.151013 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="extract-content" Dec 
16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.151019 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="extract-content" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.151213 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="c745c97d-a1cd-42b6-ab36-36827fe73bd5" containerName="registry-server" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.153039 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.165139 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wjnfr"] Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.199127 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfx24\" (UniqueName: \"kubernetes.io/projected/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-kube-api-access-hfx24\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.199240 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-utilities\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.200510 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-catalog-content\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " 
pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.302108 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfx24\" (UniqueName: \"kubernetes.io/projected/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-kube-api-access-hfx24\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.302453 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-utilities\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.302527 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-catalog-content\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.303050 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-utilities\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.303089 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-catalog-content\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " 
pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.324876 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfx24\" (UniqueName: \"kubernetes.io/projected/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-kube-api-access-hfx24\") pod \"community-operators-wjnfr\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:36 crc kubenswrapper[4789]: I1216 09:24:36.476789 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:37 crc kubenswrapper[4789]: I1216 09:24:37.035371 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wjnfr"] Dec 16 09:24:37 crc kubenswrapper[4789]: I1216 09:24:37.914309 4789 generic.go:334] "Generic (PLEG): container finished" podID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerID="f221bf446a4e66b91d7590201a80937c05d13d4be49838f61b16a027f239062f" exitCode=0 Dec 16 09:24:37 crc kubenswrapper[4789]: I1216 09:24:37.914410 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjnfr" event={"ID":"537f0ee7-20a7-40ef-89e8-28e9676fd4fa","Type":"ContainerDied","Data":"f221bf446a4e66b91d7590201a80937c05d13d4be49838f61b16a027f239062f"} Dec 16 09:24:37 crc kubenswrapper[4789]: I1216 09:24:37.915812 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjnfr" event={"ID":"537f0ee7-20a7-40ef-89e8-28e9676fd4fa","Type":"ContainerStarted","Data":"87d2ece69dc47ae31943e70f562e00ad7e287949b898a645f798cd1c5d5df831"} Dec 16 09:24:38 crc kubenswrapper[4789]: I1216 09:24:38.927776 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjnfr" 
event={"ID":"537f0ee7-20a7-40ef-89e8-28e9676fd4fa","Type":"ContainerStarted","Data":"09cec9e972a3abf014aa4a336b96b29716635bc1781048c2c1a50942c52e5083"} Dec 16 09:24:39 crc kubenswrapper[4789]: I1216 09:24:39.955810 4789 generic.go:334] "Generic (PLEG): container finished" podID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerID="09cec9e972a3abf014aa4a336b96b29716635bc1781048c2c1a50942c52e5083" exitCode=0 Dec 16 09:24:39 crc kubenswrapper[4789]: I1216 09:24:39.956027 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjnfr" event={"ID":"537f0ee7-20a7-40ef-89e8-28e9676fd4fa","Type":"ContainerDied","Data":"09cec9e972a3abf014aa4a336b96b29716635bc1781048c2c1a50942c52e5083"} Dec 16 09:24:40 crc kubenswrapper[4789]: I1216 09:24:40.968143 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjnfr" event={"ID":"537f0ee7-20a7-40ef-89e8-28e9676fd4fa","Type":"ContainerStarted","Data":"251300adaf1932159f9f873a4dda3087906c21ab7bbff92d5ce4402742ce0203"} Dec 16 09:24:40 crc kubenswrapper[4789]: I1216 09:24:40.993195 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wjnfr" podStartSLOduration=2.510959547 podStartE2EDuration="4.99317143s" podCreationTimestamp="2025-12-16 09:24:36 +0000 UTC" firstStartedPulling="2025-12-16 09:24:37.91627235 +0000 UTC m=+9216.178159979" lastFinishedPulling="2025-12-16 09:24:40.398484223 +0000 UTC m=+9218.660371862" observedRunningTime="2025-12-16 09:24:40.986649791 +0000 UTC m=+9219.248537420" watchObservedRunningTime="2025-12-16 09:24:40.99317143 +0000 UTC m=+9219.255059059" Dec 16 09:24:46 crc kubenswrapper[4789]: I1216 09:24:46.477425 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:46 crc kubenswrapper[4789]: I1216 09:24:46.478311 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:46 crc kubenswrapper[4789]: I1216 09:24:46.530512 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:47 crc kubenswrapper[4789]: I1216 09:24:47.602737 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:47 crc kubenswrapper[4789]: I1216 09:24:47.647971 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wjnfr"] Dec 16 09:24:49 crc kubenswrapper[4789]: I1216 09:24:49.053543 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wjnfr" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="registry-server" containerID="cri-o://251300adaf1932159f9f873a4dda3087906c21ab7bbff92d5ce4402742ce0203" gracePeriod=2 Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.065047 4789 generic.go:334] "Generic (PLEG): container finished" podID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerID="251300adaf1932159f9f873a4dda3087906c21ab7bbff92d5ce4402742ce0203" exitCode=0 Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.065178 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjnfr" event={"ID":"537f0ee7-20a7-40ef-89e8-28e9676fd4fa","Type":"ContainerDied","Data":"251300adaf1932159f9f873a4dda3087906c21ab7bbff92d5ce4402742ce0203"} Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.066145 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjnfr" event={"ID":"537f0ee7-20a7-40ef-89e8-28e9676fd4fa","Type":"ContainerDied","Data":"87d2ece69dc47ae31943e70f562e00ad7e287949b898a645f798cd1c5d5df831"} Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.066181 4789 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="87d2ece69dc47ae31943e70f562e00ad7e287949b898a645f798cd1c5d5df831" Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.155386 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.211383 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-catalog-content\") pod \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.211457 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfx24\" (UniqueName: \"kubernetes.io/projected/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-kube-api-access-hfx24\") pod \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.211516 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-utilities\") pod \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\" (UID: \"537f0ee7-20a7-40ef-89e8-28e9676fd4fa\") " Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.212525 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-utilities" (OuterVolumeSpecName: "utilities") pod "537f0ee7-20a7-40ef-89e8-28e9676fd4fa" (UID: "537f0ee7-20a7-40ef-89e8-28e9676fd4fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.221110 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-kube-api-access-hfx24" (OuterVolumeSpecName: "kube-api-access-hfx24") pod "537f0ee7-20a7-40ef-89e8-28e9676fd4fa" (UID: "537f0ee7-20a7-40ef-89e8-28e9676fd4fa"). InnerVolumeSpecName "kube-api-access-hfx24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.270322 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "537f0ee7-20a7-40ef-89e8-28e9676fd4fa" (UID: "537f0ee7-20a7-40ef-89e8-28e9676fd4fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.314528 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.314576 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfx24\" (UniqueName: \"kubernetes.io/projected/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-kube-api-access-hfx24\") on node \"crc\" DevicePath \"\"" Dec 16 09:24:50 crc kubenswrapper[4789]: I1216 09:24:50.314590 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537f0ee7-20a7-40ef-89e8-28e9676fd4fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:24:51 crc kubenswrapper[4789]: I1216 09:24:51.076139 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjnfr" Dec 16 09:24:51 crc kubenswrapper[4789]: I1216 09:24:51.115958 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wjnfr"] Dec 16 09:24:51 crc kubenswrapper[4789]: I1216 09:24:51.128197 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wjnfr"] Dec 16 09:24:52 crc kubenswrapper[4789]: I1216 09:24:52.116393 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" path="/var/lib/kubelet/pods/537f0ee7-20a7-40ef-89e8-28e9676fd4fa/volumes" Dec 16 09:26:51 crc kubenswrapper[4789]: I1216 09:26:51.928049 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:26:51 crc kubenswrapper[4789]: I1216 09:26:51.928640 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:27:21 crc kubenswrapper[4789]: I1216 09:27:21.928151 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:27:21 crc kubenswrapper[4789]: I1216 09:27:21.928781 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" 
podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:27:38 crc kubenswrapper[4789]: I1216 09:27:38.629968 4789 generic.go:334] "Generic (PLEG): container finished" podID="ea4afd4b-996e-4079-83b9-f2c3e8242de1" containerID="d4ae0350b5e034c789ccc896e628f89eca6511fadd8f92932d7d7c5141a16a29" exitCode=0 Dec 16 09:27:38 crc kubenswrapper[4789]: I1216 09:27:38.630051 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ea4afd4b-996e-4079-83b9-f2c3e8242de1","Type":"ContainerDied","Data":"d4ae0350b5e034c789ccc896e628f89eca6511fadd8f92932d7d7c5141a16a29"} Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.124540 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.270650 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.270810 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ssh-key\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.270829 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ca-certs\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: 
I1216 09:27:40.270908 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.270950 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk4pp\" (UniqueName: \"kubernetes.io/projected/ea4afd4b-996e-4079-83b9-f2c3e8242de1-kube-api-access-bk4pp\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.270969 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-workdir\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.271030 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config-secret\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.271090 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-temporary\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.271113 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-config-data\") pod \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\" (UID: \"ea4afd4b-996e-4079-83b9-f2c3e8242de1\") " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.271840 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.272244 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-config-data" (OuterVolumeSpecName: "config-data") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.276880 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4afd4b-996e-4079-83b9-f2c3e8242de1-kube-api-access-bk4pp" (OuterVolumeSpecName: "kube-api-access-bk4pp") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "kube-api-access-bk4pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.279631 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). 
InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.290984 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.312031 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.318167 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.318692 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.338027 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ea4afd4b-996e-4079-83b9-f2c3e8242de1" (UID: "ea4afd4b-996e-4079-83b9-f2c3e8242de1"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374549 4789 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374604 4789 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374649 4789 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374661 4789 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374673 4789 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374685 4789 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374699 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk4pp\" (UniqueName: \"kubernetes.io/projected/ea4afd4b-996e-4079-83b9-f2c3e8242de1-kube-api-access-bk4pp\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374712 4789 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ea4afd4b-996e-4079-83b9-f2c3e8242de1-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.374724 4789 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea4afd4b-996e-4079-83b9-f2c3e8242de1-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.399528 4789 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.476296 4789 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.650520 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ea4afd4b-996e-4079-83b9-f2c3e8242de1","Type":"ContainerDied","Data":"9dd5e1d3eace1ed8535479f65c99575666105fc731ef0afa94e7bd175033564b"} Dec 16 09:27:40 crc kubenswrapper[4789]: I1216 09:27:40.650611 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd5e1d3eace1ed8535479f65c99575666105fc731ef0afa94e7bd175033564b" Dec 16 09:27:40 crc 
kubenswrapper[4789]: I1216 09:27:40.650639 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 16 09:27:51 crc kubenswrapper[4789]: I1216 09:27:51.928391 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 09:27:51 crc kubenswrapper[4789]: I1216 09:27:51.929016 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 09:27:51 crc kubenswrapper[4789]: I1216 09:27:51.929060 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87"
Dec 16 09:27:51 crc kubenswrapper[4789]: I1216 09:27:51.929855 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64dd59044e7fc6d9f319f7b4fd3998634b32358794abdb51b6987acb14b1ba40"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 09:27:51 crc kubenswrapper[4789]: I1216 09:27:51.929934 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://64dd59044e7fc6d9f319f7b4fd3998634b32358794abdb51b6987acb14b1ba40" gracePeriod=600
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.422454 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 16 09:27:52 crc kubenswrapper[4789]: E1216 09:27:52.423432 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="extract-content"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.423449 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="extract-content"
Dec 16 09:27:52 crc kubenswrapper[4789]: E1216 09:27:52.423481 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="extract-utilities"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.423488 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="extract-utilities"
Dec 16 09:27:52 crc kubenswrapper[4789]: E1216 09:27:52.423502 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4afd4b-996e-4079-83b9-f2c3e8242de1" containerName="tempest-tests-tempest-tests-runner"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.423508 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4afd4b-996e-4079-83b9-f2c3e8242de1" containerName="tempest-tests-tempest-tests-runner"
Dec 16 09:27:52 crc kubenswrapper[4789]: E1216 09:27:52.423523 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="registry-server"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.423530 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="registry-server"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.423946 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="537f0ee7-20a7-40ef-89e8-28e9676fd4fa" containerName="registry-server"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.423972 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4afd4b-996e-4079-83b9-f2c3e8242de1" containerName="tempest-tests-tempest-tests-runner"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.425107 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.427092 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-lsf8v"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.440580 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.512367 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c40d0045-a36a-4ad8-bb95-24d7f2f02230\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.512433 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqc9\" (UniqueName: \"kubernetes.io/projected/c40d0045-a36a-4ad8-bb95-24d7f2f02230-kube-api-access-xkqc9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c40d0045-a36a-4ad8-bb95-24d7f2f02230\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.613994 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c40d0045-a36a-4ad8-bb95-24d7f2f02230\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.614331 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqc9\" (UniqueName: \"kubernetes.io/projected/c40d0045-a36a-4ad8-bb95-24d7f2f02230-kube-api-access-xkqc9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c40d0045-a36a-4ad8-bb95-24d7f2f02230\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.614448 4789 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c40d0045-a36a-4ad8-bb95-24d7f2f02230\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.632820 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqc9\" (UniqueName: \"kubernetes.io/projected/c40d0045-a36a-4ad8-bb95-24d7f2f02230-kube-api-access-xkqc9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c40d0045-a36a-4ad8-bb95-24d7f2f02230\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.648407 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c40d0045-a36a-4ad8-bb95-24d7f2f02230\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.746058 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.770579 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="64dd59044e7fc6d9f319f7b4fd3998634b32358794abdb51b6987acb14b1ba40" exitCode=0
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.770655 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"64dd59044e7fc6d9f319f7b4fd3998634b32358794abdb51b6987acb14b1ba40"}
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.771019 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc"}
Dec 16 09:27:52 crc kubenswrapper[4789]: I1216 09:27:52.771051 4789 scope.go:117] "RemoveContainer" containerID="3a3c6275770a897a6b018878d276477d8aa851f95703ba38b57bc1cafb2f3cb8"
Dec 16 09:27:53 crc kubenswrapper[4789]: W1216 09:27:53.304513 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40d0045_a36a_4ad8_bb95_24d7f2f02230.slice/crio-3afe8d9224c9a5a0940a80a3a9bf27842387829bdfb9160a3d2df61383916da2 WatchSource:0}: Error finding container 3afe8d9224c9a5a0940a80a3a9bf27842387829bdfb9160a3d2df61383916da2: Status 404 returned error can't find the container with id 3afe8d9224c9a5a0940a80a3a9bf27842387829bdfb9160a3d2df61383916da2
Dec 16 09:27:53 crc kubenswrapper[4789]: I1216 09:27:53.307349 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 09:27:53 crc kubenswrapper[4789]: I1216 09:27:53.312852 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 16 09:27:53 crc kubenswrapper[4789]: I1216 09:27:53.795256 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c40d0045-a36a-4ad8-bb95-24d7f2f02230","Type":"ContainerStarted","Data":"3afe8d9224c9a5a0940a80a3a9bf27842387829bdfb9160a3d2df61383916da2"}
Dec 16 09:27:56 crc kubenswrapper[4789]: I1216 09:27:56.849629 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c40d0045-a36a-4ad8-bb95-24d7f2f02230","Type":"ContainerStarted","Data":"24e5cd58f1265967511024c2fd1ee49c2cb3e9886d7b2d5a36f7c50f2688bfde"}
Dec 16 09:27:56 crc kubenswrapper[4789]: I1216 09:27:56.864314 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.743501773 podStartE2EDuration="4.86429717s" podCreationTimestamp="2025-12-16 09:27:52 +0000 UTC" firstStartedPulling="2025-12-16 09:27:53.30714277 +0000 UTC m=+9411.569030399" lastFinishedPulling="2025-12-16 09:27:55.427938167 +0000 UTC m=+9413.689825796" observedRunningTime="2025-12-16 09:27:56.863403199 +0000 UTC m=+9415.125290828" watchObservedRunningTime="2025-12-16 09:27:56.86429717 +0000 UTC m=+9415.126184799"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.028825 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vpf4l/must-gather-kz8nt"]
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.031634 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.036255 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vpf4l"/"kube-root-ca.crt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.036504 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vpf4l"/"openshift-service-ca.crt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.056042 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vpf4l/must-gather-kz8nt"]
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.081849 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmszc\" (UniqueName: \"kubernetes.io/projected/3f332980-1129-4e30-a022-8ab9019c060b-kube-api-access-gmszc\") pod \"must-gather-kz8nt\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.082049 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f332980-1129-4e30-a022-8ab9019c060b-must-gather-output\") pod \"must-gather-kz8nt\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.183548 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f332980-1129-4e30-a022-8ab9019c060b-must-gather-output\") pod \"must-gather-kz8nt\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.183725 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmszc\" (UniqueName: \"kubernetes.io/projected/3f332980-1129-4e30-a022-8ab9019c060b-kube-api-access-gmszc\") pod \"must-gather-kz8nt\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.184666 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f332980-1129-4e30-a022-8ab9019c060b-must-gather-output\") pod \"must-gather-kz8nt\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.207582 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vpf4l"/"kube-root-ca.crt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.220751 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vpf4l"/"openshift-service-ca.crt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.235663 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmszc\" (UniqueName: \"kubernetes.io/projected/3f332980-1129-4e30-a022-8ab9019c060b-kube-api-access-gmszc\") pod \"must-gather-kz8nt\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.387227 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpf4l/must-gather-kz8nt"
Dec 16 09:29:02 crc kubenswrapper[4789]: I1216 09:29:02.931965 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vpf4l/must-gather-kz8nt"]
Dec 16 09:29:03 crc kubenswrapper[4789]: I1216 09:29:03.510290 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" event={"ID":"3f332980-1129-4e30-a022-8ab9019c060b","Type":"ContainerStarted","Data":"829b45462f297e1e75d00ca7ad74b60309f3645b2a41c3024e9f40d2cc999b24"}
Dec 16 09:29:11 crc kubenswrapper[4789]: I1216 09:29:11.596236 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" event={"ID":"3f332980-1129-4e30-a022-8ab9019c060b","Type":"ContainerStarted","Data":"a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b"}
Dec 16 09:29:11 crc kubenswrapper[4789]: I1216 09:29:11.596652 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" event={"ID":"3f332980-1129-4e30-a022-8ab9019c060b","Type":"ContainerStarted","Data":"edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6"}
Dec 16 09:29:11 crc kubenswrapper[4789]: I1216 09:29:11.618758 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" podStartSLOduration=2.70143119 podStartE2EDuration="10.618740831s" podCreationTimestamp="2025-12-16 09:29:01 +0000 UTC" firstStartedPulling="2025-12-16 09:29:02.990026519 +0000 UTC m=+9481.251914148" lastFinishedPulling="2025-12-16 09:29:10.90733616 +0000 UTC m=+9489.169223789" observedRunningTime="2025-12-16 09:29:11.615125083 +0000 UTC m=+9489.877012732" watchObservedRunningTime="2025-12-16 09:29:11.618740831 +0000 UTC m=+9489.880628460"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.486554 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vpf4l/crc-debug-lmczm"]
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.490684 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.493482 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vpf4l"/"default-dockercfg-d9fvf"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.668383 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00d93a37-e1c3-4b21-829a-1453c3de86a8-host\") pod \"crc-debug-lmczm\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") " pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.668654 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88m2k\" (UniqueName: \"kubernetes.io/projected/00d93a37-e1c3-4b21-829a-1453c3de86a8-kube-api-access-88m2k\") pod \"crc-debug-lmczm\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") " pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.770247 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88m2k\" (UniqueName: \"kubernetes.io/projected/00d93a37-e1c3-4b21-829a-1453c3de86a8-kube-api-access-88m2k\") pod \"crc-debug-lmczm\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") " pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.770319 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00d93a37-e1c3-4b21-829a-1453c3de86a8-host\") pod \"crc-debug-lmczm\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") " pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.770455 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00d93a37-e1c3-4b21-829a-1453c3de86a8-host\") pod \"crc-debug-lmczm\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") " pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.790809 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88m2k\" (UniqueName: \"kubernetes.io/projected/00d93a37-e1c3-4b21-829a-1453c3de86a8-kube-api-access-88m2k\") pod \"crc-debug-lmczm\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") " pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: I1216 09:29:15.809864 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:15 crc kubenswrapper[4789]: W1216 09:29:15.848894 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00d93a37_e1c3_4b21_829a_1453c3de86a8.slice/crio-0113004f2618618bf6e290ad3a3936cc0d482e7eef68847890d0b6d35cb07241 WatchSource:0}: Error finding container 0113004f2618618bf6e290ad3a3936cc0d482e7eef68847890d0b6d35cb07241: Status 404 returned error can't find the container with id 0113004f2618618bf6e290ad3a3936cc0d482e7eef68847890d0b6d35cb07241
Dec 16 09:29:16 crc kubenswrapper[4789]: I1216 09:29:16.639646 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/crc-debug-lmczm" event={"ID":"00d93a37-e1c3-4b21-829a-1453c3de86a8","Type":"ContainerStarted","Data":"0113004f2618618bf6e290ad3a3936cc0d482e7eef68847890d0b6d35cb07241"}
Dec 16 09:29:32 crc kubenswrapper[4789]: I1216 09:29:32.828484 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/crc-debug-lmczm" event={"ID":"00d93a37-e1c3-4b21-829a-1453c3de86a8","Type":"ContainerStarted","Data":"6beab4ff81f0645ca005dbccb3bf431ce3da9e7c99403472daf6f1738b84d683"}
Dec 16 09:29:32 crc kubenswrapper[4789]: I1216 09:29:32.848910 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vpf4l/crc-debug-lmczm" podStartSLOduration=1.9082858790000001 podStartE2EDuration="17.848890861s" podCreationTimestamp="2025-12-16 09:29:15 +0000 UTC" firstStartedPulling="2025-12-16 09:29:15.851859818 +0000 UTC m=+9494.113747447" lastFinishedPulling="2025-12-16 09:29:31.7924648 +0000 UTC m=+9510.054352429" observedRunningTime="2025-12-16 09:29:32.83823949 +0000 UTC m=+9511.100127229" watchObservedRunningTime="2025-12-16 09:29:32.848890861 +0000 UTC m=+9511.110778490"
Dec 16 09:29:56 crc kubenswrapper[4789]: I1216 09:29:56.044308 4789 generic.go:334] "Generic (PLEG): container finished" podID="00d93a37-e1c3-4b21-829a-1453c3de86a8" containerID="6beab4ff81f0645ca005dbccb3bf431ce3da9e7c99403472daf6f1738b84d683" exitCode=0
Dec 16 09:29:56 crc kubenswrapper[4789]: I1216 09:29:56.044383 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/crc-debug-lmczm" event={"ID":"00d93a37-e1c3-4b21-829a-1453c3de86a8","Type":"ContainerDied","Data":"6beab4ff81f0645ca005dbccb3bf431ce3da9e7c99403472daf6f1738b84d683"}
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.166385 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.209385 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vpf4l/crc-debug-lmczm"]
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.224485 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vpf4l/crc-debug-lmczm"]
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.328441 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88m2k\" (UniqueName: \"kubernetes.io/projected/00d93a37-e1c3-4b21-829a-1453c3de86a8-kube-api-access-88m2k\") pod \"00d93a37-e1c3-4b21-829a-1453c3de86a8\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") "
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.328508 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00d93a37-e1c3-4b21-829a-1453c3de86a8-host\") pod \"00d93a37-e1c3-4b21-829a-1453c3de86a8\" (UID: \"00d93a37-e1c3-4b21-829a-1453c3de86a8\") "
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.328684 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00d93a37-e1c3-4b21-829a-1453c3de86a8-host" (OuterVolumeSpecName: "host") pod "00d93a37-e1c3-4b21-829a-1453c3de86a8" (UID: "00d93a37-e1c3-4b21-829a-1453c3de86a8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.329145 4789 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00d93a37-e1c3-4b21-829a-1453c3de86a8-host\") on node \"crc\" DevicePath \"\""
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.335044 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d93a37-e1c3-4b21-829a-1453c3de86a8-kube-api-access-88m2k" (OuterVolumeSpecName: "kube-api-access-88m2k") pod "00d93a37-e1c3-4b21-829a-1453c3de86a8" (UID: "00d93a37-e1c3-4b21-829a-1453c3de86a8"). InnerVolumeSpecName "kube-api-access-88m2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:29:57 crc kubenswrapper[4789]: I1216 09:29:57.431615 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88m2k\" (UniqueName: \"kubernetes.io/projected/00d93a37-e1c3-4b21-829a-1453c3de86a8-kube-api-access-88m2k\") on node \"crc\" DevicePath \"\""
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.066221 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0113004f2618618bf6e290ad3a3936cc0d482e7eef68847890d0b6d35cb07241"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.066290 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-lmczm"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.116383 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d93a37-e1c3-4b21-829a-1453c3de86a8" path="/var/lib/kubelet/pods/00d93a37-e1c3-4b21-829a-1453c3de86a8/volumes"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.369139 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vpf4l/crc-debug-cddt7"]
Dec 16 09:29:58 crc kubenswrapper[4789]: E1216 09:29:58.369559 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d93a37-e1c3-4b21-829a-1453c3de86a8" containerName="container-00"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.369572 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d93a37-e1c3-4b21-829a-1453c3de86a8" containerName="container-00"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.369783 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d93a37-e1c3-4b21-829a-1453c3de86a8" containerName="container-00"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.370507 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.372682 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vpf4l"/"default-dockercfg-d9fvf"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.456210 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/717100a9-f015-425e-a7c8-38d9bfe5eede-host\") pod \"crc-debug-cddt7\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") " pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.456510 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rskss\" (UniqueName: \"kubernetes.io/projected/717100a9-f015-425e-a7c8-38d9bfe5eede-kube-api-access-rskss\") pod \"crc-debug-cddt7\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") " pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.558485 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/717100a9-f015-425e-a7c8-38d9bfe5eede-host\") pod \"crc-debug-cddt7\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") " pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.558617 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rskss\" (UniqueName: \"kubernetes.io/projected/717100a9-f015-425e-a7c8-38d9bfe5eede-kube-api-access-rskss\") pod \"crc-debug-cddt7\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") " pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.559396 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/717100a9-f015-425e-a7c8-38d9bfe5eede-host\") pod \"crc-debug-cddt7\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") " pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.576704 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rskss\" (UniqueName: \"kubernetes.io/projected/717100a9-f015-425e-a7c8-38d9bfe5eede-kube-api-access-rskss\") pod \"crc-debug-cddt7\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") " pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:58 crc kubenswrapper[4789]: I1216 09:29:58.687840 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:29:59 crc kubenswrapper[4789]: I1216 09:29:59.077414 4789 generic.go:334] "Generic (PLEG): container finished" podID="717100a9-f015-425e-a7c8-38d9bfe5eede" containerID="8952aea1eec988d85f721038b31839e0bbba2badc017fbd4767d7337ab254be3" exitCode=1
Dec 16 09:29:59 crc kubenswrapper[4789]: I1216 09:29:59.077475 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/crc-debug-cddt7" event={"ID":"717100a9-f015-425e-a7c8-38d9bfe5eede","Type":"ContainerDied","Data":"8952aea1eec988d85f721038b31839e0bbba2badc017fbd4767d7337ab254be3"}
Dec 16 09:29:59 crc kubenswrapper[4789]: I1216 09:29:59.077503 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/crc-debug-cddt7" event={"ID":"717100a9-f015-425e-a7c8-38d9bfe5eede","Type":"ContainerStarted","Data":"2440b6f4ae211b6d41f2ec3fa6d58a5cc859b6214e55e04d3d13c45976bb5fa2"}
Dec 16 09:29:59 crc kubenswrapper[4789]: I1216 09:29:59.127860 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vpf4l/crc-debug-cddt7"]
Dec 16 09:29:59 crc kubenswrapper[4789]: I1216 09:29:59.138155 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vpf4l/crc-debug-cddt7"]
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.154637 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"]
Dec 16 09:30:00 crc kubenswrapper[4789]: E1216 09:30:00.155460 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717100a9-f015-425e-a7c8-38d9bfe5eede" containerName="container-00"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.155477 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="717100a9-f015-425e-a7c8-38d9bfe5eede" containerName="container-00"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.155723 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="717100a9-f015-425e-a7c8-38d9bfe5eede" containerName="container-00"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.157013 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.162089 4789 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.162132 4789 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.171367 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"]
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.206663 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-cddt7"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.294247 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/717100a9-f015-425e-a7c8-38d9bfe5eede-host\") pod \"717100a9-f015-425e-a7c8-38d9bfe5eede\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") "
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.294303 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rskss\" (UniqueName: \"kubernetes.io/projected/717100a9-f015-425e-a7c8-38d9bfe5eede-kube-api-access-rskss\") pod \"717100a9-f015-425e-a7c8-38d9bfe5eede\" (UID: \"717100a9-f015-425e-a7c8-38d9bfe5eede\") "
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.294505 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4bsf\" (UniqueName: \"kubernetes.io/projected/863604a5-5c8c-4936-9259-dd6a583fc896-kube-api-access-d4bsf\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.294533 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/863604a5-5c8c-4936-9259-dd6a583fc896-config-volume\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.294583 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/717100a9-f015-425e-a7c8-38d9bfe5eede-host" (OuterVolumeSpecName: "host") pod "717100a9-f015-425e-a7c8-38d9bfe5eede" (UID: "717100a9-f015-425e-a7c8-38d9bfe5eede"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.295213 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/863604a5-5c8c-4936-9259-dd6a583fc896-secret-volume\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.295379 4789 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/717100a9-f015-425e-a7c8-38d9bfe5eede-host\") on node \"crc\" DevicePath \"\""
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.300973 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717100a9-f015-425e-a7c8-38d9bfe5eede-kube-api-access-rskss" (OuterVolumeSpecName: "kube-api-access-rskss") pod "717100a9-f015-425e-a7c8-38d9bfe5eede" (UID: "717100a9-f015-425e-a7c8-38d9bfe5eede"). InnerVolumeSpecName "kube-api-access-rskss". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.397423 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/863604a5-5c8c-4936-9259-dd6a583fc896-secret-volume\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.397480 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4bsf\" (UniqueName: \"kubernetes.io/projected/863604a5-5c8c-4936-9259-dd6a583fc896-kube-api-access-d4bsf\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.397498 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/863604a5-5c8c-4936-9259-dd6a583fc896-config-volume\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.397663 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rskss\" (UniqueName: \"kubernetes.io/projected/717100a9-f015-425e-a7c8-38d9bfe5eede-kube-api-access-rskss\") on node \"crc\" DevicePath \"\""
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.398590 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/863604a5-5c8c-4936-9259-dd6a583fc896-config-volume\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.402080 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/863604a5-5c8c-4936-9259-dd6a583fc896-secret-volume\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.416653 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4bsf\" (UniqueName: \"kubernetes.io/projected/863604a5-5c8c-4936-9259-dd6a583fc896-kube-api-access-d4bsf\") pod \"collect-profiles-29431290-z2skq\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.523948 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"
Dec 16 09:30:00 crc kubenswrapper[4789]: W1216 09:30:00.973100 4789 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod863604a5_5c8c_4936_9259_dd6a583fc896.slice/crio-134cafc510f276ab8514c9ed6fb9ea64354d51ee94b7e5437a00b96647b292ad WatchSource:0}: Error finding container 134cafc510f276ab8514c9ed6fb9ea64354d51ee94b7e5437a00b96647b292ad: Status 404 returned error can't find the container with id 134cafc510f276ab8514c9ed6fb9ea64354d51ee94b7e5437a00b96647b292ad
Dec 16 09:30:00 crc kubenswrapper[4789]: I1216 09:30:00.975586 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq"]
Dec 16 09:30:01 crc kubenswrapper[4789]: I1216 09:30:01.108131 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq" event={"ID":"863604a5-5c8c-4936-9259-dd6a583fc896","Type":"ContainerStarted","Data":"134cafc510f276ab8514c9ed6fb9ea64354d51ee94b7e5437a00b96647b292ad"}
Dec 16 09:30:01 crc kubenswrapper[4789]: I1216 09:30:01.110954 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpf4l/crc-debug-cddt7" Dec 16 09:30:01 crc kubenswrapper[4789]: I1216 09:30:01.110784 4789 scope.go:117] "RemoveContainer" containerID="8952aea1eec988d85f721038b31839e0bbba2badc017fbd4767d7337ab254be3" Dec 16 09:30:02 crc kubenswrapper[4789]: I1216 09:30:02.118642 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717100a9-f015-425e-a7c8-38d9bfe5eede" path="/var/lib/kubelet/pods/717100a9-f015-425e-a7c8-38d9bfe5eede/volumes" Dec 16 09:30:02 crc kubenswrapper[4789]: I1216 09:30:02.128797 4789 generic.go:334] "Generic (PLEG): container finished" podID="863604a5-5c8c-4936-9259-dd6a583fc896" containerID="e57eaea11214b3bb315378290dab6e3e1b8915a5e556b964e2768ebe74d84b54" exitCode=0 Dec 16 09:30:02 crc kubenswrapper[4789]: I1216 09:30:02.128873 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq" event={"ID":"863604a5-5c8c-4936-9259-dd6a583fc896","Type":"ContainerDied","Data":"e57eaea11214b3bb315378290dab6e3e1b8915a5e556b964e2768ebe74d84b54"} Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.567592 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq" Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.661885 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/863604a5-5c8c-4936-9259-dd6a583fc896-secret-volume\") pod \"863604a5-5c8c-4936-9259-dd6a583fc896\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.662047 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/863604a5-5c8c-4936-9259-dd6a583fc896-config-volume\") pod \"863604a5-5c8c-4936-9259-dd6a583fc896\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.662143 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4bsf\" (UniqueName: \"kubernetes.io/projected/863604a5-5c8c-4936-9259-dd6a583fc896-kube-api-access-d4bsf\") pod \"863604a5-5c8c-4936-9259-dd6a583fc896\" (UID: \"863604a5-5c8c-4936-9259-dd6a583fc896\") " Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.664306 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/863604a5-5c8c-4936-9259-dd6a583fc896-config-volume" (OuterVolumeSpecName: "config-volume") pod "863604a5-5c8c-4936-9259-dd6a583fc896" (UID: "863604a5-5c8c-4936-9259-dd6a583fc896"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.668700 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863604a5-5c8c-4936-9259-dd6a583fc896-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "863604a5-5c8c-4936-9259-dd6a583fc896" (UID: "863604a5-5c8c-4936-9259-dd6a583fc896"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.668784 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863604a5-5c8c-4936-9259-dd6a583fc896-kube-api-access-d4bsf" (OuterVolumeSpecName: "kube-api-access-d4bsf") pod "863604a5-5c8c-4936-9259-dd6a583fc896" (UID: "863604a5-5c8c-4936-9259-dd6a583fc896"). InnerVolumeSpecName "kube-api-access-d4bsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.764149 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4bsf\" (UniqueName: \"kubernetes.io/projected/863604a5-5c8c-4936-9259-dd6a583fc896-kube-api-access-d4bsf\") on node \"crc\" DevicePath \"\"" Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.764179 4789 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/863604a5-5c8c-4936-9259-dd6a583fc896-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:30:03 crc kubenswrapper[4789]: I1216 09:30:03.764192 4789 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/863604a5-5c8c-4936-9259-dd6a583fc896-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 09:30:04 crc kubenswrapper[4789]: I1216 09:30:04.153755 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq" event={"ID":"863604a5-5c8c-4936-9259-dd6a583fc896","Type":"ContainerDied","Data":"134cafc510f276ab8514c9ed6fb9ea64354d51ee94b7e5437a00b96647b292ad"} Dec 16 09:30:04 crc kubenswrapper[4789]: I1216 09:30:04.153795 4789 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="134cafc510f276ab8514c9ed6fb9ea64354d51ee94b7e5437a00b96647b292ad" Dec 16 09:30:04 crc kubenswrapper[4789]: I1216 09:30:04.154046 4789 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431290-z2skq" Dec 16 09:30:04 crc kubenswrapper[4789]: I1216 09:30:04.663898 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8"] Dec 16 09:30:04 crc kubenswrapper[4789]: I1216 09:30:04.677650 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431245-zk4h8"] Dec 16 09:30:06 crc kubenswrapper[4789]: I1216 09:30:06.117297 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8549517b-5f73-46a0-805f-2c30803def4a" path="/var/lib/kubelet/pods/8549517b-5f73-46a0-805f-2c30803def4a/volumes" Dec 16 09:30:21 crc kubenswrapper[4789]: I1216 09:30:21.927773 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:30:21 crc kubenswrapper[4789]: I1216 09:30:21.928526 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:30:36 crc kubenswrapper[4789]: I1216 09:30:36.042631 4789 scope.go:117] "RemoveContainer" containerID="961726de38b39a84f1131521ea58dcfefb99e0133cf780baa96fb75be604e06f" Dec 16 09:30:51 crc kubenswrapper[4789]: I1216 09:30:51.928349 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 16 09:30:51 crc kubenswrapper[4789]: I1216 09:30:51.929030 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:31:21 crc kubenswrapper[4789]: I1216 09:31:21.927496 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 09:31:21 crc kubenswrapper[4789]: I1216 09:31:21.928078 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 09:31:21 crc kubenswrapper[4789]: I1216 09:31:21.928121 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" Dec 16 09:31:21 crc kubenswrapper[4789]: I1216 09:31:21.928858 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 09:31:21 crc kubenswrapper[4789]: I1216 09:31:21.928928 4789 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" gracePeriod=600 Dec 16 09:31:22 crc kubenswrapper[4789]: E1216 09:31:22.049471 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:31:22 crc kubenswrapper[4789]: I1216 09:31:22.853412 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" exitCode=0 Dec 16 09:31:22 crc kubenswrapper[4789]: I1216 09:31:22.853498 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc"} Dec 16 09:31:22 crc kubenswrapper[4789]: I1216 09:31:22.853761 4789 scope.go:117] "RemoveContainer" containerID="64dd59044e7fc6d9f319f7b4fd3998634b32358794abdb51b6987acb14b1ba40" Dec 16 09:31:22 crc kubenswrapper[4789]: I1216 09:31:22.854459 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:31:22 crc kubenswrapper[4789]: E1216 09:31:22.854924 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:31:36 crc kubenswrapper[4789]: I1216 09:31:36.117495 4789 scope.go:117] "RemoveContainer" containerID="251300adaf1932159f9f873a4dda3087906c21ab7bbff92d5ce4402742ce0203" Dec 16 09:31:36 crc kubenswrapper[4789]: I1216 09:31:36.142385 4789 scope.go:117] "RemoveContainer" containerID="09cec9e972a3abf014aa4a336b96b29716635bc1781048c2c1a50942c52e5083" Dec 16 09:31:36 crc kubenswrapper[4789]: I1216 09:31:36.174503 4789 scope.go:117] "RemoveContainer" containerID="f221bf446a4e66b91d7590201a80937c05d13d4be49838f61b16a027f239062f" Dec 16 09:31:38 crc kubenswrapper[4789]: I1216 09:31:38.105822 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:31:38 crc kubenswrapper[4789]: E1216 09:31:38.106690 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:31:50 crc kubenswrapper[4789]: I1216 09:31:50.105402 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:31:50 crc kubenswrapper[4789]: E1216 09:31:50.106894 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:32:01 crc kubenswrapper[4789]: I1216 09:32:01.106116 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:32:01 crc kubenswrapper[4789]: E1216 09:32:01.107058 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:32:15 crc kubenswrapper[4789]: I1216 09:32:15.105946 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:32:15 crc kubenswrapper[4789]: E1216 09:32:15.106705 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:32:30 crc kubenswrapper[4789]: I1216 09:32:30.105248 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:32:30 crc kubenswrapper[4789]: E1216 09:32:30.105958 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:32:39 crc kubenswrapper[4789]: I1216 09:32:39.732208 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_519395da-f8ab-483d-ae5f-adb8e234939d/init-config-reloader/0.log" Dec 16 09:32:39 crc kubenswrapper[4789]: I1216 09:32:39.912909 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_519395da-f8ab-483d-ae5f-adb8e234939d/init-config-reloader/0.log" Dec 16 09:32:39 crc kubenswrapper[4789]: I1216 09:32:39.958805 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_519395da-f8ab-483d-ae5f-adb8e234939d/alertmanager/0.log" Dec 16 09:32:39 crc kubenswrapper[4789]: I1216 09:32:39.966464 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_519395da-f8ab-483d-ae5f-adb8e234939d/config-reloader/0.log" Dec 16 09:32:40 crc kubenswrapper[4789]: I1216 09:32:40.402500 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6a6f0826-3133-40ba-9139-b5cec2b92a29/aodh-api/0.log" Dec 16 09:32:40 crc kubenswrapper[4789]: I1216 09:32:40.402678 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6a6f0826-3133-40ba-9139-b5cec2b92a29/aodh-listener/0.log" Dec 16 09:32:40 crc kubenswrapper[4789]: I1216 09:32:40.479925 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6a6f0826-3133-40ba-9139-b5cec2b92a29/aodh-evaluator/0.log" Dec 16 09:32:40 crc kubenswrapper[4789]: I1216 09:32:40.573256 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6a6f0826-3133-40ba-9139-b5cec2b92a29/aodh-notifier/0.log" Dec 16 09:32:40 crc kubenswrapper[4789]: I1216 
09:32:40.701641 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55c9b758c6-tbswb_a6470a1d-33f9-4895-b6af-f797aedf568e/barbican-api/0.log" Dec 16 09:32:40 crc kubenswrapper[4789]: I1216 09:32:40.706514 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55c9b758c6-tbswb_a6470a1d-33f9-4895-b6af-f797aedf568e/barbican-api-log/0.log" Dec 16 09:32:40 crc kubenswrapper[4789]: I1216 09:32:40.890406 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7df795b6b4-6qb9j_58a03d3a-03c2-47af-872b-2aed04c99bbc/barbican-keystone-listener/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.298426 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7df795b6b4-6qb9j_58a03d3a-03c2-47af-872b-2aed04c99bbc/barbican-keystone-listener-log/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.315469 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d594f4d55-cjsm4_06e922aa-fb79-405f-afd2-dc07a0bc8809/barbican-worker/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.381237 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d594f4d55-cjsm4_06e922aa-fb79-405f-afd2-dc07a0bc8809/barbican-worker-log/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.545462 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-zf49z_d5635ad5-e918-492f-b2b3-8e8893ba73e1/bootstrap-openstack-openstack-cell1/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.668513 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0f488b83-1fc0-40a5-be04-50e5267d4792/ceilometer-central-agent/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.720952 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_0f488b83-1fc0-40a5-be04-50e5267d4792/ceilometer-notification-agent/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.765661 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0f488b83-1fc0-40a5-be04-50e5267d4792/proxy-httpd/0.log" Dec 16 09:32:41 crc kubenswrapper[4789]: I1216 09:32:41.823625 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0f488b83-1fc0-40a5-be04-50e5267d4792/sg-core/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.034689 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-vltnx_19b99655-7a2f-4367-9b3f-c0897a02bed3/ceph-client-openstack-openstack-cell1/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.132500 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_04bb6971-b904-45a9-92a2-fda570c52dcd/cinder-api/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.362094 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_04bb6971-b904-45a9-92a2-fda570c52dcd/cinder-api-log/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.526186 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5f480915-7f85-4e43-a3b6-63303a284b70/probe/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.544525 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5f480915-7f85-4e43-a3b6-63303a284b70/cinder-backup/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.654218 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f1ad6601-c17e-4847-b540-8bc8d8997934/cinder-scheduler/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.725755 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_f1ad6601-c17e-4847-b540-8bc8d8997934/probe/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.897230 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b26fb36c-42c2-4316-bab9-af89a7e7df12/cinder-volume/0.log" Dec 16 09:32:42 crc kubenswrapper[4789]: I1216 09:32:42.926956 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_b26fb36c-42c2-4316-bab9-af89a7e7df12/probe/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.002052 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-v8jrp_7e556aea-3590-4797-a0f7-27cfbc22be03/configure-network-openstack-openstack-cell1/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.160051 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-k4rq4_408cd4a2-4575-49af-992b-a5f2dde363ef/configure-os-openstack-openstack-cell1/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.211048 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8494b7758f-dvnll_00a6f21f-4c8b-423c-b645-2e9ff6222c95/init/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.406680 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-9qlnq_4eaf1fcf-a995-49f8-89b5-fb771151402f/download-cache-openstack-openstack-cell1/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.409178 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8494b7758f-dvnll_00a6f21f-4c8b-423c-b645-2e9ff6222c95/init/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.455859 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8494b7758f-dvnll_00a6f21f-4c8b-423c-b645-2e9ff6222c95/dnsmasq-dns/0.log" Dec 16 
09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.612234 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9/glance-log/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.619736 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_314a0a2d-bfe2-41ed-8a0d-f3715dbe6be9/glance-httpd/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.807315 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_979e6177-3aa1-44ed-bfa3-aa69902ad292/glance-httpd/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.880165 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_979e6177-3aa1-44ed-bfa3-aa69902ad292/glance-log/0.log" Dec 16 09:32:43 crc kubenswrapper[4789]: I1216 09:32:43.934526 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5b79b95f86-txdfg_ac197a7d-175c-4aec-b5cf-cfa32de39925/heat-api/0.log" Dec 16 09:32:44 crc kubenswrapper[4789]: I1216 09:32:44.142446 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-c6745d44c-ww76k_8d1b396a-0632-4f6c-9668-dd2cb3038923/heat-cfnapi/0.log" Dec 16 09:32:44 crc kubenswrapper[4789]: I1216 09:32:44.187314 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-754bb6b78c-jqn25_c203e891-92ba-4644-8138-b8375640c961/heat-engine/0.log" Dec 16 09:32:44 crc kubenswrapper[4789]: I1216 09:32:44.433641 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58fbf69d97-87vvw_f6dcac86-7cd3-427c-a5a3-24b2d4c02361/horizon/0.log" Dec 16 09:32:44 crc kubenswrapper[4789]: I1216 09:32:44.446156 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-2mkth_839706dd-4b2b-4821-9d5b-374e1f23f6bf/install-certs-openstack-openstack-cell1/0.log" Dec 16 09:32:44 crc kubenswrapper[4789]: I1216 09:32:44.469084 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-58fbf69d97-87vvw_f6dcac86-7cd3-427c-a5a3-24b2d4c02361/horizon-log/0.log" Dec 16 09:32:44 crc kubenswrapper[4789]: I1216 09:32:44.845964 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-zhhd9_451b6be9-d35b-4c1a-b4ce-448dcb086baf/install-os-openstack-openstack-cell1/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.017948 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29431261-m4757_8d1fba60-e7e3-4cb8-9b09-859d467c1f62/keystone-cron/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.107892 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:32:45 crc kubenswrapper[4789]: E1216 09:32:45.108257 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.287228 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-4lrbh_4cb7847a-6a82-44b8-a1da-6583cb76efc8/libvirt-openstack-openstack-cell1/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.300280 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_bf90e140-074b-4515-bf9f-827a89acbce4/kube-state-metrics/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.313994 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6c9ff75b57-xt8cn_88b34b95-4592-4ddc-ac54-686f169961d0/keystone-api/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.586401 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_c7fe2a8b-685e-46d1-8890-7f4a0d752135/manila-api-log/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.617380 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_c7fe2a8b-685e-46d1-8890-7f4a0d752135/manila-api/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.639256 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7f770399-9fac-4410-b6be-ecb2830512c5/manila-scheduler/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.676531 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7f770399-9fac-4410-b6be-ecb2830512c5/probe/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.803661 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_7a00d6fa-713d-4297-b5cc-7ca06b736d65/probe/0.log" Dec 16 09:32:45 crc kubenswrapper[4789]: I1216 09:32:45.933611 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_7a00d6fa-713d-4297-b5cc-7ca06b736d65/manila-share/0.log" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.162128 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q77qx"] Dec 16 09:32:46 crc kubenswrapper[4789]: E1216 09:32:46.163272 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863604a5-5c8c-4936-9259-dd6a583fc896" containerName="collect-profiles" Dec 16 09:32:46 crc 
kubenswrapper[4789]: I1216 09:32:46.163288 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="863604a5-5c8c-4936-9259-dd6a583fc896" containerName="collect-profiles" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.163685 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="863604a5-5c8c-4936-9259-dd6a583fc896" containerName="collect-profiles" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.166552 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.180294 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q77qx"] Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.312431 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-catalog-content\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.312693 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbcvq\" (UniqueName: \"kubernetes.io/projected/b381060d-638a-4efb-87ba-41d4db6dbd98-kube-api-access-tbcvq\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.312926 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-utilities\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 
09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.414485 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbcvq\" (UniqueName: \"kubernetes.io/projected/b381060d-638a-4efb-87ba-41d4db6dbd98-kube-api-access-tbcvq\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.414584 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-utilities\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.414650 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-catalog-content\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.415123 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-utilities\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.415322 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-catalog-content\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 
09:32:46.433937 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d9f7fc5b5-p8zdl_7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07/neutron-httpd/0.log" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.475821 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbcvq\" (UniqueName: \"kubernetes.io/projected/b381060d-638a-4efb-87ba-41d4db6dbd98-kube-api-access-tbcvq\") pod \"certified-operators-q77qx\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.484491 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-h74df_ffb74c16-ff91-4d62-a5ea-c381a02c0768/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.502465 4789 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:46 crc kubenswrapper[4789]: I1216 09:32:46.955871 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d9f7fc5b5-p8zdl_7bf8b381-0e40-4ec0-9cd2-9b5752ec9e07/neutron-api/0.log" Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.035214 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-n2xkz_74959a3a-150a-4441-a8d3-b717d73415ca/neutron-metadata-openstack-openstack-cell1/0.log" Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.116284 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q77qx"] Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.324510 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-l888x_4aed6986-dbe5-45bd-84e6-a1e31c1a89be/neutron-sriov-openstack-openstack-cell1/0.log" Dec 16 09:32:47 crc 
kubenswrapper[4789]: I1216 09:32:47.528675 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a7cb033e-87a1-40d6-bf9f-0be6422d7493/nova-api-api/0.log" Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.611350 4789 generic.go:334] "Generic (PLEG): container finished" podID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerID="9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946" exitCode=0 Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.611401 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77qx" event={"ID":"b381060d-638a-4efb-87ba-41d4db6dbd98","Type":"ContainerDied","Data":"9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946"} Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.611436 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77qx" event={"ID":"b381060d-638a-4efb-87ba-41d4db6dbd98","Type":"ContainerStarted","Data":"92e1c9a2d496548f0990585e09d31715610ebf25e2dfb547cd6bda52a9445752"} Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.726538 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a7cb033e-87a1-40d6-bf9f-0be6422d7493/nova-api-log/0.log" Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.749725 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e404e342-e901-43fc-9652-6c4c67a65469/nova-cell0-conductor-conductor/0.log" Dec 16 09:32:47 crc kubenswrapper[4789]: I1216 09:32:47.919771 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a7493e41-dae1-4d90-a734-fda98dd32937/nova-cell1-conductor-conductor/0.log" Dec 16 09:32:48 crc kubenswrapper[4789]: I1216 09:32:48.089188 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_84c5c815-268d-487d-a36e-4f9db5e4ae44/nova-cell1-novncproxy-novncproxy/0.log" 
Dec 16 09:32:48 crc kubenswrapper[4789]: I1216 09:32:48.248589 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellnrs8w_12ce9e20-a637-474c-862b-a8c47381fda9/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 16 09:32:48 crc kubenswrapper[4789]: I1216 09:32:48.633375 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77qx" event={"ID":"b381060d-638a-4efb-87ba-41d4db6dbd98","Type":"ContainerStarted","Data":"830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496"} Dec 16 09:32:48 crc kubenswrapper[4789]: I1216 09:32:48.769688 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-c7bll_f2844d2e-202c-470b-9bb9-cb0506134f3c/nova-cell1-openstack-openstack-cell1/0.log" Dec 16 09:32:48 crc kubenswrapper[4789]: I1216 09:32:48.803995 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791/nova-metadata-log/0.log" Dec 16 09:32:48 crc kubenswrapper[4789]: I1216 09:32:48.882163 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b3e8043f-1e4d-41a1-92f6-9cfa3ad8b791/nova-metadata-metadata/0.log" Dec 16 09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.088472 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c38d7e41-cc58-418e-8988-969ed80309c0/nova-scheduler-scheduler/0.log" Dec 16 09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.217517 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46553071-2569-4448-bd5c-f5862a4e71f5/mysql-bootstrap/0.log" Dec 16 09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.313294 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46553071-2569-4448-bd5c-f5862a4e71f5/mysql-bootstrap/0.log" Dec 16 
09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.548413 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b040e505-3d77-42ec-b501-1b6fd0799640/mysql-bootstrap/0.log" Dec 16 09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.555022 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46553071-2569-4448-bd5c-f5862a4e71f5/galera/0.log" Dec 16 09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.746966 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b040e505-3d77-42ec-b501-1b6fd0799640/mysql-bootstrap/0.log" Dec 16 09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.789239 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b040e505-3d77-42ec-b501-1b6fd0799640/galera/0.log" Dec 16 09:32:49 crc kubenswrapper[4789]: I1216 09:32:49.813237 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4731a9c9-432c-4a87-8809-062a075bae7d/openstackclient/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.063236 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2e8dd6cb-78cb-41ae-88e0-2a0b3720d598/openstack-network-exporter/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.063780 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2e8dd6cb-78cb-41ae-88e0-2a0b3720d598/ovn-northd/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.319132 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_33ce8208-1afd-4f72-bda9-cdb9017e3e51/openstack-network-exporter/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.482662 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-ppzs5_e08f18e5-cd25-40b5-a8fa-2af2530846f4/ovn-openstack-openstack-cell1/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: 
I1216 09:32:50.513650 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_33ce8208-1afd-4f72-bda9-cdb9017e3e51/ovsdbserver-nb/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.613022 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_0e439248-3fed-4550-9d89-8fec7e155a09/openstack-network-exporter/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.657056 4789 generic.go:334] "Generic (PLEG): container finished" podID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerID="830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496" exitCode=0 Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.657321 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77qx" event={"ID":"b381060d-638a-4efb-87ba-41d4db6dbd98","Type":"ContainerDied","Data":"830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496"} Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.748415 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_0e439248-3fed-4550-9d89-8fec7e155a09/ovsdbserver-nb/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.879257 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_a51d9137-cd54-4a8d-8217-cebbf247f188/openstack-network-exporter/0.log" Dec 16 09:32:50 crc kubenswrapper[4789]: I1216 09:32:50.881783 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_a51d9137-cd54-4a8d-8217-cebbf247f188/ovsdbserver-nb/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.016602 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_76507e71-1eee-4984-9c27-631ab3a139f3/openstack-network-exporter/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.142353 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_76507e71-1eee-4984-9c27-631ab3a139f3/ovsdbserver-sb/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.323022 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_042af17b-3a9b-43a0-b270-f52582835b5a/openstack-network-exporter/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.439796 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_042af17b-3a9b-43a0-b270-f52582835b5a/ovsdbserver-sb/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.490735 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5eb2a930-2fde-4ead-a1d6-6b319fddafc7/openstack-network-exporter/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.603512 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5eb2a930-2fde-4ead-a1d6-6b319fddafc7/ovsdbserver-sb/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.669831 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77qx" event={"ID":"b381060d-638a-4efb-87ba-41d4db6dbd98","Type":"ContainerStarted","Data":"6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37"} Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.703752 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q77qx" podStartSLOduration=2.116326615 podStartE2EDuration="5.703726506s" podCreationTimestamp="2025-12-16 09:32:46 +0000 UTC" firstStartedPulling="2025-12-16 09:32:47.613202933 +0000 UTC m=+9705.875090562" lastFinishedPulling="2025-12-16 09:32:51.200602824 +0000 UTC m=+9709.462490453" observedRunningTime="2025-12-16 09:32:51.69611022 +0000 UTC m=+9709.957997849" watchObservedRunningTime="2025-12-16 09:32:51.703726506 +0000 UTC m=+9709.965614135" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 
09:32:51.933043 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-86f8f549f6-wsg8b_aedac820-75c0-4fe8-865d-39225c3f8b09/placement-api/0.log" Dec 16 09:32:51 crc kubenswrapper[4789]: I1216 09:32:51.958051 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-86f8f549f6-wsg8b_aedac820-75c0-4fe8-865d-39225c3f8b09/placement-log/0.log" Dec 16 09:32:52 crc kubenswrapper[4789]: I1216 09:32:52.031162 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c7kkgl_9a8c6e87-54ee-4a74-9754-6eace44ccce0/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 16 09:32:52 crc kubenswrapper[4789]: I1216 09:32:52.245174 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a938c46-1e0f-4af2-873c-de4472f39a2b/init-config-reloader/0.log" Dec 16 09:32:52 crc kubenswrapper[4789]: I1216 09:32:52.657328 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a938c46-1e0f-4af2-873c-de4472f39a2b/init-config-reloader/0.log" Dec 16 09:32:52 crc kubenswrapper[4789]: I1216 09:32:52.742181 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a938c46-1e0f-4af2-873c-de4472f39a2b/config-reloader/0.log" Dec 16 09:32:52 crc kubenswrapper[4789]: I1216 09:32:52.815302 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a938c46-1e0f-4af2-873c-de4472f39a2b/prometheus/0.log" Dec 16 09:32:52 crc kubenswrapper[4789]: I1216 09:32:52.831146 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_1a938c46-1e0f-4af2-873c-de4472f39a2b/thanos-sidecar/0.log" Dec 16 09:32:53 crc kubenswrapper[4789]: I1216 09:32:53.054985 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f88e8a07-49e9-4e55-9b79-18990a74ac97/setup-container/0.log" Dec 16 09:32:53 crc kubenswrapper[4789]: I1216 09:32:53.350776 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f88e8a07-49e9-4e55-9b79-18990a74ac97/setup-container/0.log" Dec 16 09:32:53 crc kubenswrapper[4789]: I1216 09:32:53.356125 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1b059bdb-f5c3-47eb-88f4-b89b3529450c/setup-container/0.log" Dec 16 09:32:53 crc kubenswrapper[4789]: I1216 09:32:53.412473 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f88e8a07-49e9-4e55-9b79-18990a74ac97/rabbitmq/0.log" Dec 16 09:32:53 crc kubenswrapper[4789]: I1216 09:32:53.696406 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1b059bdb-f5c3-47eb-88f4-b89b3529450c/setup-container/0.log" Dec 16 09:32:53 crc kubenswrapper[4789]: I1216 09:32:53.733313 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1b059bdb-f5c3-47eb-88f4-b89b3529450c/rabbitmq/0.log" Dec 16 09:32:53 crc kubenswrapper[4789]: I1216 09:32:53.962652 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-6m8gq_ea1d73ed-d948-4c5a-bda3-c4f13fc0572c/reboot-os-openstack-openstack-cell1/0.log" Dec 16 09:32:54 crc kubenswrapper[4789]: I1216 09:32:54.083854 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-hsc2c_d5ff0c7a-b121-4c2d-a17e-acb58761e419/run-os-openstack-openstack-cell1/0.log" Dec 16 09:32:54 crc kubenswrapper[4789]: I1216 09:32:54.285495 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-527hf_5405dbb7-1841-4e80-a4a4-08513cb61917/ssh-known-hosts-openstack/0.log" Dec 16 09:32:54 crc kubenswrapper[4789]: I1216 
09:32:54.776131 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-zkjlb_ea52433e-1eda-40ec-8bb9-32652828eeec/telemetry-openstack-openstack-cell1/0.log" Dec 16 09:32:54 crc kubenswrapper[4789]: I1216 09:32:54.808116 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ea4afd4b-996e-4079-83b9-f2c3e8242de1/tempest-tests-tempest-tests-runner/0.log" Dec 16 09:32:54 crc kubenswrapper[4789]: I1216 09:32:54.920134 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c40d0045-a36a-4ad8-bb95-24d7f2f02230/test-operator-logs-container/0.log" Dec 16 09:32:55 crc kubenswrapper[4789]: I1216 09:32:55.142334 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-z628q_18288168-e59a-407b-99e1-0a8f2a73109d/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 16 09:32:55 crc kubenswrapper[4789]: I1216 09:32:55.238979 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-4zxgp_ae3c5bd6-2381-4bd4-8567-f2fecac95765/validate-network-openstack-openstack-cell1/0.log" Dec 16 09:32:56 crc kubenswrapper[4789]: I1216 09:32:56.111771 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:32:56 crc kubenswrapper[4789]: E1216 09:32:56.112819 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:32:56 crc kubenswrapper[4789]: I1216 
09:32:56.503634 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:56 crc kubenswrapper[4789]: I1216 09:32:56.503721 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:56 crc kubenswrapper[4789]: I1216 09:32:56.560825 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:56 crc kubenswrapper[4789]: I1216 09:32:56.766154 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:56 crc kubenswrapper[4789]: I1216 09:32:56.840871 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q77qx"] Dec 16 09:32:58 crc kubenswrapper[4789]: I1216 09:32:58.748421 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q77qx" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerName="registry-server" containerID="cri-o://6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37" gracePeriod=2 Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.302775 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.408770 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbcvq\" (UniqueName: \"kubernetes.io/projected/b381060d-638a-4efb-87ba-41d4db6dbd98-kube-api-access-tbcvq\") pod \"b381060d-638a-4efb-87ba-41d4db6dbd98\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.409077 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-catalog-content\") pod \"b381060d-638a-4efb-87ba-41d4db6dbd98\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.409114 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-utilities\") pod \"b381060d-638a-4efb-87ba-41d4db6dbd98\" (UID: \"b381060d-638a-4efb-87ba-41d4db6dbd98\") " Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.410048 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-utilities" (OuterVolumeSpecName: "utilities") pod "b381060d-638a-4efb-87ba-41d4db6dbd98" (UID: "b381060d-638a-4efb-87ba-41d4db6dbd98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.410343 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.458978 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b381060d-638a-4efb-87ba-41d4db6dbd98" (UID: "b381060d-638a-4efb-87ba-41d4db6dbd98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.512126 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b381060d-638a-4efb-87ba-41d4db6dbd98-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.679791 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b381060d-638a-4efb-87ba-41d4db6dbd98-kube-api-access-tbcvq" (OuterVolumeSpecName: "kube-api-access-tbcvq") pod "b381060d-638a-4efb-87ba-41d4db6dbd98" (UID: "b381060d-638a-4efb-87ba-41d4db6dbd98"). InnerVolumeSpecName "kube-api-access-tbcvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.716800 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbcvq\" (UniqueName: \"kubernetes.io/projected/b381060d-638a-4efb-87ba-41d4db6dbd98-kube-api-access-tbcvq\") on node \"crc\" DevicePath \"\"" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.781684 4789 generic.go:334] "Generic (PLEG): container finished" podID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerID="6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37" exitCode=0 Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.781735 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77qx" event={"ID":"b381060d-638a-4efb-87ba-41d4db6dbd98","Type":"ContainerDied","Data":"6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37"} Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.781765 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77qx" event={"ID":"b381060d-638a-4efb-87ba-41d4db6dbd98","Type":"ContainerDied","Data":"92e1c9a2d496548f0990585e09d31715610ebf25e2dfb547cd6bda52a9445752"} Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.781785 4789 scope.go:117] "RemoveContainer" containerID="6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.782084 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q77qx" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.832458 4789 scope.go:117] "RemoveContainer" containerID="830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.855083 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q77qx"] Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.872903 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q77qx"] Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.884938 4789 scope.go:117] "RemoveContainer" containerID="9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.942016 4789 scope.go:117] "RemoveContainer" containerID="6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37" Dec 16 09:32:59 crc kubenswrapper[4789]: E1216 09:32:59.942434 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37\": container with ID starting with 6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37 not found: ID does not exist" containerID="6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.942484 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37"} err="failed to get container status \"6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37\": rpc error: code = NotFound desc = could not find container \"6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37\": container with ID starting with 6b1bf9bed81ffe4b637bd93a286c93dacdb3e1e268a9880163414357fd956d37 not 
found: ID does not exist" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.942514 4789 scope.go:117] "RemoveContainer" containerID="830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496" Dec 16 09:32:59 crc kubenswrapper[4789]: E1216 09:32:59.943094 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496\": container with ID starting with 830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496 not found: ID does not exist" containerID="830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.943123 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496"} err="failed to get container status \"830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496\": rpc error: code = NotFound desc = could not find container \"830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496\": container with ID starting with 830d71855d9e5ae380c4bf3d1949392379d12a8550e4d94830872fba45616496 not found: ID does not exist" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.943140 4789 scope.go:117] "RemoveContainer" containerID="9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946" Dec 16 09:32:59 crc kubenswrapper[4789]: E1216 09:32:59.944093 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946\": container with ID starting with 9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946 not found: ID does not exist" containerID="9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946" Dec 16 09:32:59 crc kubenswrapper[4789]: I1216 09:32:59.944132 4789 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946"} err="failed to get container status \"9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946\": rpc error: code = NotFound desc = could not find container \"9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946\": container with ID starting with 9add013389a517a1e9b3b980cf7c1efafe81551e78ff76b6b42fd7f891ec4946 not found: ID does not exist" Dec 16 09:33:00 crc kubenswrapper[4789]: I1216 09:33:00.119399 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" path="/var/lib/kubelet/pods/b381060d-638a-4efb-87ba-41d4db6dbd98/volumes" Dec 16 09:33:08 crc kubenswrapper[4789]: I1216 09:33:08.106363 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:33:08 crc kubenswrapper[4789]: E1216 09:33:08.107124 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.117263 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8pkgj"] Dec 16 09:33:12 crc kubenswrapper[4789]: E1216 09:33:12.129962 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerName="extract-utilities" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.129986 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" 
containerName="extract-utilities" Dec 16 09:33:12 crc kubenswrapper[4789]: E1216 09:33:12.130014 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerName="registry-server" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.130020 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerName="registry-server" Dec 16 09:33:12 crc kubenswrapper[4789]: E1216 09:33:12.130061 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerName="extract-content" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.130071 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerName="extract-content" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.130382 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="b381060d-638a-4efb-87ba-41d4db6dbd98" containerName="registry-server" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.132333 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pkgj"] Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.132482 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.291320 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-utilities\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.291387 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-catalog-content\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.291479 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqgfm\" (UniqueName: \"kubernetes.io/projected/558104a4-425f-401e-89fd-6338334eb937-kube-api-access-jqgfm\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.393109 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-utilities\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.393216 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-catalog-content\") pod \"redhat-operators-8pkgj\" (UID: 
\"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.393319 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqgfm\" (UniqueName: \"kubernetes.io/projected/558104a4-425f-401e-89fd-6338334eb937-kube-api-access-jqgfm\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.394184 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-utilities\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.394800 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-catalog-content\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.433026 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqgfm\" (UniqueName: \"kubernetes.io/projected/558104a4-425f-401e-89fd-6338334eb937-kube-api-access-jqgfm\") pod \"redhat-operators-8pkgj\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.458975 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:12 crc kubenswrapper[4789]: I1216 09:33:12.957627 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pkgj"] Dec 16 09:33:13 crc kubenswrapper[4789]: I1216 09:33:13.089609 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1e364136-8097-48a4-ad88-c3fc2967154d/memcached/0.log" Dec 16 09:33:13 crc kubenswrapper[4789]: I1216 09:33:13.953634 4789 generic.go:334] "Generic (PLEG): container finished" podID="558104a4-425f-401e-89fd-6338334eb937" containerID="9ead2433c28cc839c6f973a45cebdd58a83ea1634e4bdc6196162796d0600dac" exitCode=0 Dec 16 09:33:13 crc kubenswrapper[4789]: I1216 09:33:13.953745 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pkgj" event={"ID":"558104a4-425f-401e-89fd-6338334eb937","Type":"ContainerDied","Data":"9ead2433c28cc839c6f973a45cebdd58a83ea1634e4bdc6196162796d0600dac"} Dec 16 09:33:13 crc kubenswrapper[4789]: I1216 09:33:13.953944 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pkgj" event={"ID":"558104a4-425f-401e-89fd-6338334eb937","Type":"ContainerStarted","Data":"4ea8627030fcd04bad7c751634414c2ca370f5bd7fa97df2697ae0f4cc6a6bfd"} Dec 16 09:33:13 crc kubenswrapper[4789]: I1216 09:33:13.955642 4789 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 09:33:14 crc kubenswrapper[4789]: I1216 09:33:14.963434 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pkgj" event={"ID":"558104a4-425f-401e-89fd-6338334eb937","Type":"ContainerStarted","Data":"5a7163db511e10c9ab084f1a83a82d5379cf9a9d295beddcb5fea8c8dba886f5"} Dec 16 09:33:18 crc kubenswrapper[4789]: I1216 09:33:18.002494 4789 generic.go:334] "Generic (PLEG): container finished" podID="558104a4-425f-401e-89fd-6338334eb937" 
containerID="5a7163db511e10c9ab084f1a83a82d5379cf9a9d295beddcb5fea8c8dba886f5" exitCode=0 Dec 16 09:33:18 crc kubenswrapper[4789]: I1216 09:33:18.002591 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pkgj" event={"ID":"558104a4-425f-401e-89fd-6338334eb937","Type":"ContainerDied","Data":"5a7163db511e10c9ab084f1a83a82d5379cf9a9d295beddcb5fea8c8dba886f5"} Dec 16 09:33:19 crc kubenswrapper[4789]: I1216 09:33:19.014481 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pkgj" event={"ID":"558104a4-425f-401e-89fd-6338334eb937","Type":"ContainerStarted","Data":"387aa11be83951ba5188d28123ad0d51fc0ad91c8193351b191e457fbf03d325"} Dec 16 09:33:19 crc kubenswrapper[4789]: I1216 09:33:19.040909 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8pkgj" podStartSLOduration=2.238658514 podStartE2EDuration="7.040892156s" podCreationTimestamp="2025-12-16 09:33:12 +0000 UTC" firstStartedPulling="2025-12-16 09:33:13.955413984 +0000 UTC m=+9732.217301613" lastFinishedPulling="2025-12-16 09:33:18.757647626 +0000 UTC m=+9737.019535255" observedRunningTime="2025-12-16 09:33:19.03815593 +0000 UTC m=+9737.300043559" watchObservedRunningTime="2025-12-16 09:33:19.040892156 +0000 UTC m=+9737.302779785" Dec 16 09:33:20 crc kubenswrapper[4789]: I1216 09:33:20.104965 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:33:20 crc kubenswrapper[4789]: E1216 09:33:20.105598 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" 
podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:33:22 crc kubenswrapper[4789]: I1216 09:33:22.459835 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:22 crc kubenswrapper[4789]: I1216 09:33:22.460261 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:23 crc kubenswrapper[4789]: I1216 09:33:23.511873 4789 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8pkgj" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="registry-server" probeResult="failure" output=< Dec 16 09:33:23 crc kubenswrapper[4789]: timeout: failed to connect service ":50051" within 1s Dec 16 09:33:23 crc kubenswrapper[4789]: > Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.114819 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-hlm8z_6132cbf3-8a9f-4505-adcc-2e46beb5bf0e/manager/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.241107 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-g2w7k_cc2943a0-fd8f-49bd-bf85-aa6fb274e999/manager/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.272528 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j_e147cdc2-fabb-407d-992a-d4a654c09fa2/util/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.485024 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j_e147cdc2-fabb-407d-992a-d4a654c09fa2/util/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.512387 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j_e147cdc2-fabb-407d-992a-d4a654c09fa2/pull/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.512578 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j_e147cdc2-fabb-407d-992a-d4a654c09fa2/pull/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.706949 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j_e147cdc2-fabb-407d-992a-d4a654c09fa2/util/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.752153 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j_e147cdc2-fabb-407d-992a-d4a654c09fa2/pull/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.773210 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d0d3c6809aa44224aab298086e27935b9dda9ef79c520e38a6c6b641afsrl7j_e147cdc2-fabb-407d-992a-d4a654c09fa2/extract/0.log" Dec 16 09:33:25 crc kubenswrapper[4789]: I1216 09:33:25.924457 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-gkhql_0152085f-c1f6-478c-9044-749eb51fad39/manager/0.log" Dec 16 09:33:26 crc kubenswrapper[4789]: I1216 09:33:26.109306 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-bm8v5_f4d189a6-9923-41a3-be17-a18a76b9d382/manager/0.log" Dec 16 09:33:26 crc kubenswrapper[4789]: I1216 09:33:26.203529 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-2t6n8_dced6ed3-bec9-4abe-a6d8-6e0efaad4f4f/manager/0.log" Dec 16 09:33:26 crc kubenswrapper[4789]: I1216 
09:33:26.320327 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-8wxm6_226147eb-5ae9-43a3-8d68-19115b510a2f/manager/0.log" Dec 16 09:33:26 crc kubenswrapper[4789]: I1216 09:33:26.585939 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-f46p9_10da721c-ec68-4b14-b65e-ebf283e4ba59/manager/0.log" Dec 16 09:33:26 crc kubenswrapper[4789]: I1216 09:33:26.942701 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-p29cz_db9badf9-8fa3-484a-8ca4-ffa31c0c29c5/manager/0.log" Dec 16 09:33:26 crc kubenswrapper[4789]: I1216 09:33:26.988770 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-j97qq_d30a0974-7667-4999-9c46-3970ad1a6a8b/manager/0.log" Dec 16 09:33:26 crc kubenswrapper[4789]: I1216 09:33:26.994143 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-pxvzs_5d5d9279-e35b-4b95-be8e-dc54a056e7b5/manager/0.log" Dec 16 09:33:27 crc kubenswrapper[4789]: I1216 09:33:27.188852 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-jfxhw_4e02dba2-7cf2-4cbd-a2f2-b91ddbec517d/manager/0.log" Dec 16 09:33:27 crc kubenswrapper[4789]: I1216 09:33:27.338232 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-72lbq_822ac1df-18a3-4440-bd77-507c589ff693/manager/0.log" Dec 16 09:33:27 crc kubenswrapper[4789]: I1216 09:33:27.561285 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-d6bcb_c7268550-e5d4-4664-b04d-ecfa498cb475/manager/0.log" Dec 16 09:33:27 crc kubenswrapper[4789]: 
I1216 09:33:27.604243 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-d99lr_d03193ca-0584-4d77-bab4-5e42abf5b5b5/manager/0.log" Dec 16 09:33:27 crc kubenswrapper[4789]: I1216 09:33:27.718445 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-66fff4bf6br444w_a08c1d95-200f-40ce-abef-dbb505570602/manager/0.log" Dec 16 09:33:28 crc kubenswrapper[4789]: I1216 09:33:28.132642 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-69fc74c8bb-hghpf_c1f222bf-e01e-4bd6-a12a-15b726f8bb85/operator/0.log" Dec 16 09:33:28 crc kubenswrapper[4789]: I1216 09:33:28.408345 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-z5pmx_424c1bbc-8b15-4d1e-8988-9f514926d253/registry-server/0.log" Dec 16 09:33:28 crc kubenswrapper[4789]: I1216 09:33:28.523575 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-6ztbz_826f108e-bfd8-43bb-8719-d9a569778578/manager/0.log" Dec 16 09:33:28 crc kubenswrapper[4789]: I1216 09:33:28.690511 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-b6v9v_5db5b7f8-cc13-42b5-9c72-87bf990091d2/manager/0.log" Dec 16 09:33:28 crc kubenswrapper[4789]: I1216 09:33:28.903534 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-k7fwx_26befb39-90f5-4fa1-8f8a-3b82ebae6472/operator/0.log" Dec 16 09:33:29 crc kubenswrapper[4789]: I1216 09:33:29.096556 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-8dfvw_1950613d-02b6-4c9f-925a-e3ece57069ed/manager/0.log" Dec 16 09:33:29 crc 
kubenswrapper[4789]: I1216 09:33:29.444037 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-f6ndr_a9aa6ddb-befe-472b-bbaf-c17285d7ade4/manager/0.log" Dec 16 09:33:29 crc kubenswrapper[4789]: I1216 09:33:29.559510 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-8vdpj_8fce40f9-3595-4e54-816f-9e567e87ef4b/manager/0.log" Dec 16 09:33:29 crc kubenswrapper[4789]: I1216 09:33:29.891056 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-4xsfm_07ee29dd-a0f9-4a6b-b694-3fbacc25a4e0/manager/0.log" Dec 16 09:33:30 crc kubenswrapper[4789]: I1216 09:33:30.179509 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-678747d7fb-swjqj_467a702b-f3c6-42ef-ba9f-ec19e7d2a291/manager/0.log" Dec 16 09:33:32 crc kubenswrapper[4789]: I1216 09:33:32.514870 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:32 crc kubenswrapper[4789]: I1216 09:33:32.571646 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:32 crc kubenswrapper[4789]: I1216 09:33:32.758622 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pkgj"] Dec 16 09:33:34 crc kubenswrapper[4789]: I1216 09:33:34.205945 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8pkgj" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="registry-server" containerID="cri-o://387aa11be83951ba5188d28123ad0d51fc0ad91c8193351b191e457fbf03d325" gracePeriod=2 Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.105544 4789 scope.go:117] 
"RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:33:35 crc kubenswrapper[4789]: E1216 09:33:35.106234 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.217850 4789 generic.go:334] "Generic (PLEG): container finished" podID="558104a4-425f-401e-89fd-6338334eb937" containerID="387aa11be83951ba5188d28123ad0d51fc0ad91c8193351b191e457fbf03d325" exitCode=0 Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.217938 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pkgj" event={"ID":"558104a4-425f-401e-89fd-6338334eb937","Type":"ContainerDied","Data":"387aa11be83951ba5188d28123ad0d51fc0ad91c8193351b191e457fbf03d325"} Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.577061 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.702653 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqgfm\" (UniqueName: \"kubernetes.io/projected/558104a4-425f-401e-89fd-6338334eb937-kube-api-access-jqgfm\") pod \"558104a4-425f-401e-89fd-6338334eb937\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.702740 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-utilities\") pod \"558104a4-425f-401e-89fd-6338334eb937\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.702959 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-catalog-content\") pod \"558104a4-425f-401e-89fd-6338334eb937\" (UID: \"558104a4-425f-401e-89fd-6338334eb937\") " Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.703586 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-utilities" (OuterVolumeSpecName: "utilities") pod "558104a4-425f-401e-89fd-6338334eb937" (UID: "558104a4-425f-401e-89fd-6338334eb937"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.711979 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558104a4-425f-401e-89fd-6338334eb937-kube-api-access-jqgfm" (OuterVolumeSpecName: "kube-api-access-jqgfm") pod "558104a4-425f-401e-89fd-6338334eb937" (UID: "558104a4-425f-401e-89fd-6338334eb937"). InnerVolumeSpecName "kube-api-access-jqgfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.805480 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqgfm\" (UniqueName: \"kubernetes.io/projected/558104a4-425f-401e-89fd-6338334eb937-kube-api-access-jqgfm\") on node \"crc\" DevicePath \"\"" Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.805514 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.826991 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "558104a4-425f-401e-89fd-6338334eb937" (UID: "558104a4-425f-401e-89fd-6338334eb937"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:33:35 crc kubenswrapper[4789]: I1216 09:33:35.907311 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558104a4-425f-401e-89fd-6338334eb937-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:33:36 crc kubenswrapper[4789]: I1216 09:33:36.227768 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pkgj" event={"ID":"558104a4-425f-401e-89fd-6338334eb937","Type":"ContainerDied","Data":"4ea8627030fcd04bad7c751634414c2ca370f5bd7fa97df2697ae0f4cc6a6bfd"} Dec 16 09:33:36 crc kubenswrapper[4789]: I1216 09:33:36.227820 4789 scope.go:117] "RemoveContainer" containerID="387aa11be83951ba5188d28123ad0d51fc0ad91c8193351b191e457fbf03d325" Dec 16 09:33:36 crc kubenswrapper[4789]: I1216 09:33:36.227847 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pkgj" Dec 16 09:33:36 crc kubenswrapper[4789]: I1216 09:33:36.250217 4789 scope.go:117] "RemoveContainer" containerID="5a7163db511e10c9ab084f1a83a82d5379cf9a9d295beddcb5fea8c8dba886f5" Dec 16 09:33:36 crc kubenswrapper[4789]: I1216 09:33:36.253372 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pkgj"] Dec 16 09:33:36 crc kubenswrapper[4789]: I1216 09:33:36.264375 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8pkgj"] Dec 16 09:33:36 crc kubenswrapper[4789]: I1216 09:33:36.680601 4789 scope.go:117] "RemoveContainer" containerID="9ead2433c28cc839c6f973a45cebdd58a83ea1634e4bdc6196162796d0600dac" Dec 16 09:33:38 crc kubenswrapper[4789]: I1216 09:33:38.122098 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558104a4-425f-401e-89fd-6338334eb937" path="/var/lib/kubelet/pods/558104a4-425f-401e-89fd-6338334eb937/volumes" Dec 16 09:33:49 crc kubenswrapper[4789]: I1216 09:33:49.104976 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:33:49 crc kubenswrapper[4789]: E1216 09:33:49.105828 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:33:51 crc kubenswrapper[4789]: I1216 09:33:51.271134 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hx7q5_fa338783-6d00-4150-96d3-03ef1f28eb41/control-plane-machine-set-operator/0.log" Dec 16 09:33:51 crc 
kubenswrapper[4789]: I1216 09:33:51.438720 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fnph9_8b2c6c23-962c-4829-bd8a-088c7c63dfa4/kube-rbac-proxy/0.log" Dec 16 09:33:51 crc kubenswrapper[4789]: I1216 09:33:51.448675 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fnph9_8b2c6c23-962c-4829-bd8a-088c7c63dfa4/machine-api-operator/0.log" Dec 16 09:34:02 crc kubenswrapper[4789]: I1216 09:34:02.113196 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:34:02 crc kubenswrapper[4789]: E1216 09:34:02.113956 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:34:04 crc kubenswrapper[4789]: I1216 09:34:04.129318 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-6jrv2_b457f25c-782e-4215-9e13-afcbf2c32dc6/cert-manager-controller/0.log" Dec 16 09:34:04 crc kubenswrapper[4789]: I1216 09:34:04.303417 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-vtvzn_3c23c454-c4d1-4e67-bd4a-69e1014e5a5c/cert-manager-webhook/0.log" Dec 16 09:34:04 crc kubenswrapper[4789]: I1216 09:34:04.304957 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-xnqdf_e8660c70-e0a3-4c56-aff9-eccfb4fa297d/cert-manager-cainjector/0.log" Dec 16 09:34:17 crc kubenswrapper[4789]: I1216 09:34:17.105721 4789 scope.go:117] "RemoveContainer" 
containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:34:17 crc kubenswrapper[4789]: E1216 09:34:17.106553 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:34:17 crc kubenswrapper[4789]: I1216 09:34:17.543574 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bfwks_2febe2b7-c4da-4fca-bb73-6e4e3bc19c36/nmstate-handler/0.log" Dec 16 09:34:17 crc kubenswrapper[4789]: I1216 09:34:17.562144 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-2gnhj_312b3314-cd6b-422b-910e-9fdf5df3d594/nmstate-console-plugin/0.log" Dec 16 09:34:17 crc kubenswrapper[4789]: I1216 09:34:17.708903 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-t4fl7_ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28/kube-rbac-proxy/0.log" Dec 16 09:34:17 crc kubenswrapper[4789]: I1216 09:34:17.774169 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-t4fl7_ff37692e-6be8-4ebb-b3fe-1a58fcb4ac28/nmstate-metrics/0.log" Dec 16 09:34:17 crc kubenswrapper[4789]: I1216 09:34:17.914649 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-cvp25_31c10535-b6da-4119-a311-1065b2bcb324/nmstate-operator/0.log" Dec 16 09:34:17 crc kubenswrapper[4789]: I1216 09:34:17.937906 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-pl68l_d3e603c6-bac1-496b-bf35-1e8124144121/nmstate-webhook/0.log" Dec 16 09:34:28 crc kubenswrapper[4789]: I1216 09:34:28.105327 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:34:28 crc kubenswrapper[4789]: E1216 09:34:28.106176 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:34:31 crc kubenswrapper[4789]: I1216 09:34:31.298462 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-pjwhk_f1fca8a9-546e-4fa3-b08f-cf5df54303e0/kube-rbac-proxy/0.log" Dec 16 09:34:31 crc kubenswrapper[4789]: I1216 09:34:31.817170 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-pjwhk_f1fca8a9-546e-4fa3-b08f-cf5df54303e0/controller/0.log" Dec 16 09:34:31 crc kubenswrapper[4789]: I1216 09:34:31.962054 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-frr-files/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.128158 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-frr-files/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.196115 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-reloader/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.226763 4789 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-reloader/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.237946 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-metrics/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.409629 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-frr-files/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.414238 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-reloader/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.429446 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-metrics/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.473681 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-metrics/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.620738 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/controller/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.654732 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-reloader/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.659368 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-frr-files/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.661674 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/cp-metrics/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.846522 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/kube-rbac-proxy/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.884881 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/frr-metrics/0.log" Dec 16 09:34:32 crc kubenswrapper[4789]: I1216 09:34:32.908716 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/kube-rbac-proxy-frr/0.log" Dec 16 09:34:33 crc kubenswrapper[4789]: I1216 09:34:33.104989 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/reloader/0.log" Dec 16 09:34:33 crc kubenswrapper[4789]: I1216 09:34:33.124302 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-5spwp_f0535b2c-a6bb-4092-b481-ccba194fd9b4/frr-k8s-webhook-server/0.log" Dec 16 09:34:33 crc kubenswrapper[4789]: I1216 09:34:33.464582 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6bffc6b469-npjk4_14d02d92-8bff-4937-9b63-13592d6626fc/manager/0.log" Dec 16 09:34:33 crc kubenswrapper[4789]: I1216 09:34:33.801239 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zk7nk_fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6/kube-rbac-proxy/0.log" Dec 16 09:34:33 crc kubenswrapper[4789]: I1216 09:34:33.885682 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d79c48cdb-tn64c_0bb94b2b-4f6d-4600-aa95-93751de6c723/webhook-server/0.log" Dec 16 09:34:34 crc kubenswrapper[4789]: I1216 09:34:34.892929 4789 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zk7nk_fdc6fbc9-7246-4b82-a97e-d3b09c3b57e6/speaker/0.log" Dec 16 09:34:36 crc kubenswrapper[4789]: I1216 09:34:36.026506 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rfbh5_d3ae31c7-d2d7-4a0f-b8eb-59d7be857994/frr/0.log" Dec 16 09:34:40 crc kubenswrapper[4789]: I1216 09:34:40.105056 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:34:40 crc kubenswrapper[4789]: E1216 09:34:40.105690 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:34:47 crc kubenswrapper[4789]: I1216 09:34:47.765614 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f_0f18cbd4-42d9-4b83-b929-1cb218f960b4/util/0.log" Dec 16 09:34:47 crc kubenswrapper[4789]: I1216 09:34:47.851997 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f_0f18cbd4-42d9-4b83-b929-1cb218f960b4/util/0.log" Dec 16 09:34:47 crc kubenswrapper[4789]: I1216 09:34:47.915776 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f_0f18cbd4-42d9-4b83-b929-1cb218f960b4/pull/0.log" Dec 16 09:34:47 crc kubenswrapper[4789]: I1216 09:34:47.971637 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f_0f18cbd4-42d9-4b83-b929-1cb218f960b4/pull/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.111954 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f_0f18cbd4-42d9-4b83-b929-1cb218f960b4/util/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.113192 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f_0f18cbd4-42d9-4b83-b929-1cb218f960b4/extract/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.140623 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2nm4f_0f18cbd4-42d9-4b83-b929-1cb218f960b4/pull/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.328220 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk_5167ca38-0011-4e95-81d3-48e193836144/util/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.444671 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk_5167ca38-0011-4e95-81d3-48e193836144/pull/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.477819 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk_5167ca38-0011-4e95-81d3-48e193836144/pull/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.490301 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk_5167ca38-0011-4e95-81d3-48e193836144/util/0.log" Dec 16 
09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.654616 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk_5167ca38-0011-4e95-81d3-48e193836144/pull/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.659804 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk_5167ca38-0011-4e95-81d3-48e193836144/util/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.661166 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ljzpk_5167ca38-0011-4e95-81d3-48e193836144/extract/0.log" Dec 16 09:34:48 crc kubenswrapper[4789]: I1216 09:34:48.825511 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt_27e0476c-3b9a-4129-8376-55b976dbcadc/util/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.046294 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt_27e0476c-3b9a-4129-8376-55b976dbcadc/pull/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.053930 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt_27e0476c-3b9a-4129-8376-55b976dbcadc/pull/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.056324 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt_27e0476c-3b9a-4129-8376-55b976dbcadc/util/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.208977 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt_27e0476c-3b9a-4129-8376-55b976dbcadc/util/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.246073 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt_27e0476c-3b9a-4129-8376-55b976dbcadc/extract/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.257242 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210df9mt_27e0476c-3b9a-4129-8376-55b976dbcadc/pull/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.362318 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64_1dfc67d5-cfb7-4210-8d0a-0b1e87e77127/util/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.923390 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64_1dfc67d5-cfb7-4210-8d0a-0b1e87e77127/util/0.log" Dec 16 09:34:49 crc kubenswrapper[4789]: I1216 09:34:49.989246 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64_1dfc67d5-cfb7-4210-8d0a-0b1e87e77127/pull/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.017541 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64_1dfc67d5-cfb7-4210-8d0a-0b1e87e77127/pull/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.167139 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64_1dfc67d5-cfb7-4210-8d0a-0b1e87e77127/pull/0.log" Dec 16 
09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.182657 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64_1dfc67d5-cfb7-4210-8d0a-0b1e87e77127/util/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.218575 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8jpc64_1dfc67d5-cfb7-4210-8d0a-0b1e87e77127/extract/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.334715 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5xm2_f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4/extract-utilities/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.505279 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5xm2_f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4/extract-utilities/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.539013 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5xm2_f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4/extract-content/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.547121 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5xm2_f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4/extract-content/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.706581 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5xm2_f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4/extract-content/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.732408 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5xm2_f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4/extract-utilities/0.log" Dec 16 09:34:50 crc kubenswrapper[4789]: I1216 09:34:50.928137 
4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2lwrk_c7339aad-951b-4f0f-8868-44e1d98f5871/extract-utilities/0.log" Dec 16 09:34:51 crc kubenswrapper[4789]: I1216 09:34:51.112582 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:34:51 crc kubenswrapper[4789]: E1216 09:34:51.113206 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:34:51 crc kubenswrapper[4789]: I1216 09:34:51.176741 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2lwrk_c7339aad-951b-4f0f-8868-44e1d98f5871/extract-content/0.log" Dec 16 09:34:51 crc kubenswrapper[4789]: I1216 09:34:51.182301 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2lwrk_c7339aad-951b-4f0f-8868-44e1d98f5871/extract-utilities/0.log" Dec 16 09:34:51 crc kubenswrapper[4789]: I1216 09:34:51.240940 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2lwrk_c7339aad-951b-4f0f-8868-44e1d98f5871/extract-content/0.log" Dec 16 09:34:51 crc kubenswrapper[4789]: I1216 09:34:51.926369 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2lwrk_c7339aad-951b-4f0f-8868-44e1d98f5871/extract-utilities/0.log" Dec 16 09:34:51 crc kubenswrapper[4789]: I1216 09:34:51.962719 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2lwrk_c7339aad-951b-4f0f-8868-44e1d98f5871/extract-content/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.182241 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xktm2_26b72fe5-4aad-4c74-917c-9333d34ea481/marketplace-operator/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.194256 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-m5xm2_f40bfb8a-79f1-4f9b-adfb-e4b93628e2d4/registry-server/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.350540 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jfd8n_e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420/extract-utilities/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.528618 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jfd8n_e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420/extract-utilities/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.555885 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jfd8n_e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420/extract-content/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.570346 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jfd8n_e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420/extract-content/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.784397 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jfd8n_e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420/extract-utilities/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.821186 4789 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jfd8n_e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420/extract-content/0.log" Dec 16 09:34:52 crc kubenswrapper[4789]: I1216 09:34:52.994327 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ncql8_ce2801ec-57a1-436f-afeb-bc9fac03ec0a/extract-utilities/0.log" Dec 16 09:34:53 crc kubenswrapper[4789]: I1216 09:34:53.277556 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jfd8n_e28c35f9-1d8d-4e8c-9fc8-6e9b239a0420/registry-server/0.log" Dec 16 09:34:53 crc kubenswrapper[4789]: I1216 09:34:53.289100 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ncql8_ce2801ec-57a1-436f-afeb-bc9fac03ec0a/extract-content/0.log" Dec 16 09:34:53 crc kubenswrapper[4789]: I1216 09:34:53.300433 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ncql8_ce2801ec-57a1-436f-afeb-bc9fac03ec0a/extract-utilities/0.log" Dec 16 09:34:53 crc kubenswrapper[4789]: I1216 09:34:53.339727 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ncql8_ce2801ec-57a1-436f-afeb-bc9fac03ec0a/extract-content/0.log" Dec 16 09:34:53 crc kubenswrapper[4789]: I1216 09:34:53.351924 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2lwrk_c7339aad-951b-4f0f-8868-44e1d98f5871/registry-server/0.log" Dec 16 09:34:53 crc kubenswrapper[4789]: I1216 09:34:53.550592 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ncql8_ce2801ec-57a1-436f-afeb-bc9fac03ec0a/extract-content/0.log" Dec 16 09:34:53 crc kubenswrapper[4789]: I1216 09:34:53.563196 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ncql8_ce2801ec-57a1-436f-afeb-bc9fac03ec0a/extract-utilities/0.log" Dec 
16 09:34:54 crc kubenswrapper[4789]: I1216 09:34:54.768031 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ncql8_ce2801ec-57a1-436f-afeb-bc9fac03ec0a/registry-server/0.log" Dec 16 09:35:04 crc kubenswrapper[4789]: I1216 09:35:04.106080 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:35:04 crc kubenswrapper[4789]: E1216 09:35:04.106959 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:35:05 crc kubenswrapper[4789]: I1216 09:35:05.879367 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-klwsc_83b9a811-bc86-44be-a5e3-bac352d1f377/prometheus-operator/0.log" Dec 16 09:35:05 crc kubenswrapper[4789]: I1216 09:35:05.987287 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c85d754c7-r28lm_b84c157e-f7a3-4b07-acdb-0f833aa4bdc3/prometheus-operator-admission-webhook/0.log" Dec 16 09:35:06 crc kubenswrapper[4789]: I1216 09:35:06.086286 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c85d754c7-zjsms_82adb0c3-8b98-4764-b8ca-11eb3c373f16/prometheus-operator-admission-webhook/0.log" Dec 16 09:35:06 crc kubenswrapper[4789]: I1216 09:35:06.193846 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-hktzx_cab75a30-cb53-4af1-9236-b475e66dcaec/operator/0.log" Dec 16 09:35:06 crc 
kubenswrapper[4789]: I1216 09:35:06.292666 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-nmtvf_df255ee7-fed8-4845-a2af-49497297cfd4/perses-operator/0.log" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.713873 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgvkk"] Dec 16 09:35:08 crc kubenswrapper[4789]: E1216 09:35:08.714929 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="extract-content" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.714948 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="extract-content" Dec 16 09:35:08 crc kubenswrapper[4789]: E1216 09:35:08.714975 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="registry-server" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.714983 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="registry-server" Dec 16 09:35:08 crc kubenswrapper[4789]: E1216 09:35:08.715004 4789 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="extract-utilities" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.715013 4789 state_mem.go:107] "Deleted CPUSet assignment" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="extract-utilities" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.715280 4789 memory_manager.go:354] "RemoveStaleState removing state" podUID="558104a4-425f-401e-89fd-6338334eb937" containerName="registry-server" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.717172 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.731112 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgvkk"] Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.812472 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-utilities\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.812531 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-catalog-content\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.812586 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cls9m\" (UniqueName: \"kubernetes.io/projected/aff7102c-027c-40ae-bab0-e4ef11d9b11a-kube-api-access-cls9m\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.914441 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-utilities\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.914524 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-catalog-content\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.914598 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cls9m\" (UniqueName: \"kubernetes.io/projected/aff7102c-027c-40ae-bab0-e4ef11d9b11a-kube-api-access-cls9m\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.915080 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-utilities\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.915446 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-catalog-content\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:08 crc kubenswrapper[4789]: I1216 09:35:08.937839 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cls9m\" (UniqueName: \"kubernetes.io/projected/aff7102c-027c-40ae-bab0-e4ef11d9b11a-kube-api-access-cls9m\") pod \"community-operators-kgvkk\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:09 crc kubenswrapper[4789]: I1216 09:35:09.036579 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:09 crc kubenswrapper[4789]: I1216 09:35:09.639849 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgvkk"] Dec 16 09:35:10 crc kubenswrapper[4789]: I1216 09:35:10.119806 4789 generic.go:334] "Generic (PLEG): container finished" podID="aff7102c-027c-40ae-bab0-e4ef11d9b11a" containerID="8b68aadb3ca7402fd7c7453dc8bcc4b724175e15dee11faa0664e88a5c50887a" exitCode=0 Dec 16 09:35:10 crc kubenswrapper[4789]: I1216 09:35:10.119889 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgvkk" event={"ID":"aff7102c-027c-40ae-bab0-e4ef11d9b11a","Type":"ContainerDied","Data":"8b68aadb3ca7402fd7c7453dc8bcc4b724175e15dee11faa0664e88a5c50887a"} Dec 16 09:35:10 crc kubenswrapper[4789]: I1216 09:35:10.120228 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgvkk" event={"ID":"aff7102c-027c-40ae-bab0-e4ef11d9b11a","Type":"ContainerStarted","Data":"a2669ca55633ed7c5e5a34e0861246e2147c8861f2fb0db55f3c70d7be2c434e"} Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.121988 4789 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h5cn5"] Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.124553 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.159487 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5cn5"] Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.189885 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-catalog-content\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.190023 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbx4\" (UniqueName: \"kubernetes.io/projected/d4aab0f9-3c8a-45c5-8448-29e214b43dde-kube-api-access-dtbx4\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.190128 4789 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-utilities\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.292402 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-catalog-content\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.292720 4789 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dtbx4\" (UniqueName: \"kubernetes.io/projected/d4aab0f9-3c8a-45c5-8448-29e214b43dde-kube-api-access-dtbx4\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.292898 4789 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-utilities\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.293430 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-utilities\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.295649 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-catalog-content\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.666742 4789 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbx4\" (UniqueName: \"kubernetes.io/projected/d4aab0f9-3c8a-45c5-8448-29e214b43dde-kube-api-access-dtbx4\") pod \"redhat-marketplace-h5cn5\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:11 crc kubenswrapper[4789]: I1216 09:35:11.750038 4789 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:12 crc kubenswrapper[4789]: I1216 09:35:12.321871 4789 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5cn5"] Dec 16 09:35:13 crc kubenswrapper[4789]: I1216 09:35:13.154726 4789 generic.go:334] "Generic (PLEG): container finished" podID="d4aab0f9-3c8a-45c5-8448-29e214b43dde" containerID="d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2" exitCode=0 Dec 16 09:35:13 crc kubenswrapper[4789]: I1216 09:35:13.156045 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5cn5" event={"ID":"d4aab0f9-3c8a-45c5-8448-29e214b43dde","Type":"ContainerDied","Data":"d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2"} Dec 16 09:35:13 crc kubenswrapper[4789]: I1216 09:35:13.156447 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5cn5" event={"ID":"d4aab0f9-3c8a-45c5-8448-29e214b43dde","Type":"ContainerStarted","Data":"b8ad3010429d535574361df6edf5532797a52ad9d627bbe663ef7db87b62a6a2"} Dec 16 09:35:13 crc kubenswrapper[4789]: I1216 09:35:13.160606 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgvkk" event={"ID":"aff7102c-027c-40ae-bab0-e4ef11d9b11a","Type":"ContainerStarted","Data":"7479daa22188ed73a726ef498fb288b72b0b4797f1d44dd1fe6c1707df041d25"} Dec 16 09:35:14 crc kubenswrapper[4789]: I1216 09:35:14.172306 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5cn5" event={"ID":"d4aab0f9-3c8a-45c5-8448-29e214b43dde","Type":"ContainerStarted","Data":"e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91"} Dec 16 09:35:14 crc kubenswrapper[4789]: I1216 09:35:14.179849 4789 generic.go:334] "Generic (PLEG): container finished" podID="aff7102c-027c-40ae-bab0-e4ef11d9b11a" 
containerID="7479daa22188ed73a726ef498fb288b72b0b4797f1d44dd1fe6c1707df041d25" exitCode=0 Dec 16 09:35:14 crc kubenswrapper[4789]: I1216 09:35:14.179884 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgvkk" event={"ID":"aff7102c-027c-40ae-bab0-e4ef11d9b11a","Type":"ContainerDied","Data":"7479daa22188ed73a726ef498fb288b72b0b4797f1d44dd1fe6c1707df041d25"} Dec 16 09:35:16 crc kubenswrapper[4789]: I1216 09:35:16.205326 4789 generic.go:334] "Generic (PLEG): container finished" podID="d4aab0f9-3c8a-45c5-8448-29e214b43dde" containerID="e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91" exitCode=0 Dec 16 09:35:16 crc kubenswrapper[4789]: I1216 09:35:16.205384 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5cn5" event={"ID":"d4aab0f9-3c8a-45c5-8448-29e214b43dde","Type":"ContainerDied","Data":"e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91"} Dec 16 09:35:16 crc kubenswrapper[4789]: I1216 09:35:16.210083 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgvkk" event={"ID":"aff7102c-027c-40ae-bab0-e4ef11d9b11a","Type":"ContainerStarted","Data":"e05f6121ee0bc644e332ef971a9ab43f49c190e2ff2dba7a3791d004f2e7046f"} Dec 16 09:35:16 crc kubenswrapper[4789]: I1216 09:35:16.248435 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgvkk" podStartSLOduration=2.467488786 podStartE2EDuration="8.248416259s" podCreationTimestamp="2025-12-16 09:35:08 +0000 UTC" firstStartedPulling="2025-12-16 09:35:10.121763069 +0000 UTC m=+9848.383650688" lastFinishedPulling="2025-12-16 09:35:15.902690532 +0000 UTC m=+9854.164578161" observedRunningTime="2025-12-16 09:35:16.247111958 +0000 UTC m=+9854.508999607" watchObservedRunningTime="2025-12-16 09:35:16.248416259 +0000 UTC m=+9854.510303888" Dec 16 09:35:17 crc kubenswrapper[4789]: I1216 
09:35:17.221959 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5cn5" event={"ID":"d4aab0f9-3c8a-45c5-8448-29e214b43dde","Type":"ContainerStarted","Data":"af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5"} Dec 16 09:35:17 crc kubenswrapper[4789]: I1216 09:35:17.245438 4789 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h5cn5" podStartSLOduration=2.560240891 podStartE2EDuration="6.245413889s" podCreationTimestamp="2025-12-16 09:35:11 +0000 UTC" firstStartedPulling="2025-12-16 09:35:13.158095715 +0000 UTC m=+9851.419983344" lastFinishedPulling="2025-12-16 09:35:16.843268713 +0000 UTC m=+9855.105156342" observedRunningTime="2025-12-16 09:35:17.23685224 +0000 UTC m=+9855.498739869" watchObservedRunningTime="2025-12-16 09:35:17.245413889 +0000 UTC m=+9855.507301528" Dec 16 09:35:19 crc kubenswrapper[4789]: I1216 09:35:19.037621 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:19 crc kubenswrapper[4789]: I1216 09:35:19.037696 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:19 crc kubenswrapper[4789]: I1216 09:35:19.087496 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:19 crc kubenswrapper[4789]: I1216 09:35:19.105123 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:35:19 crc kubenswrapper[4789]: E1216 09:35:19.105399 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:35:21 crc kubenswrapper[4789]: I1216 09:35:21.750741 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:21 crc kubenswrapper[4789]: I1216 09:35:21.751357 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:21 crc kubenswrapper[4789]: I1216 09:35:21.796803 4789 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:22 crc kubenswrapper[4789]: I1216 09:35:22.520636 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:25 crc kubenswrapper[4789]: I1216 09:35:25.504699 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5cn5"] Dec 16 09:35:25 crc kubenswrapper[4789]: I1216 09:35:25.505518 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h5cn5" podUID="d4aab0f9-3c8a-45c5-8448-29e214b43dde" containerName="registry-server" containerID="cri-o://af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5" gracePeriod=2 Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.075716 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.236135 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-utilities\") pod \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.236322 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-catalog-content\") pod \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.236354 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbx4\" (UniqueName: \"kubernetes.io/projected/d4aab0f9-3c8a-45c5-8448-29e214b43dde-kube-api-access-dtbx4\") pod \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\" (UID: \"d4aab0f9-3c8a-45c5-8448-29e214b43dde\") " Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.237140 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-utilities" (OuterVolumeSpecName: "utilities") pod "d4aab0f9-3c8a-45c5-8448-29e214b43dde" (UID: "d4aab0f9-3c8a-45c5-8448-29e214b43dde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.247658 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4aab0f9-3c8a-45c5-8448-29e214b43dde-kube-api-access-dtbx4" (OuterVolumeSpecName: "kube-api-access-dtbx4") pod "d4aab0f9-3c8a-45c5-8448-29e214b43dde" (UID: "d4aab0f9-3c8a-45c5-8448-29e214b43dde"). InnerVolumeSpecName "kube-api-access-dtbx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.286679 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4aab0f9-3c8a-45c5-8448-29e214b43dde" (UID: "d4aab0f9-3c8a-45c5-8448-29e214b43dde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.320825 4789 generic.go:334] "Generic (PLEG): container finished" podID="d4aab0f9-3c8a-45c5-8448-29e214b43dde" containerID="af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5" exitCode=0 Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.320933 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5cn5" event={"ID":"d4aab0f9-3c8a-45c5-8448-29e214b43dde","Type":"ContainerDied","Data":"af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5"} Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.320976 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h5cn5" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.321017 4789 scope.go:117] "RemoveContainer" containerID="af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.320984 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h5cn5" event={"ID":"d4aab0f9-3c8a-45c5-8448-29e214b43dde","Type":"ContainerDied","Data":"b8ad3010429d535574361df6edf5532797a52ad9d627bbe663ef7db87b62a6a2"} Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.340315 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.340356 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbx4\" (UniqueName: \"kubernetes.io/projected/d4aab0f9-3c8a-45c5-8448-29e214b43dde-kube-api-access-dtbx4\") on node \"crc\" DevicePath \"\"" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.340369 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4aab0f9-3c8a-45c5-8448-29e214b43dde-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.355292 4789 scope.go:117] "RemoveContainer" containerID="e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.366063 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5cn5"] Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.376167 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h5cn5"] Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.384677 4789 scope.go:117] 
"RemoveContainer" containerID="d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.432108 4789 scope.go:117] "RemoveContainer" containerID="af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5" Dec 16 09:35:26 crc kubenswrapper[4789]: E1216 09:35:26.432709 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5\": container with ID starting with af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5 not found: ID does not exist" containerID="af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.432749 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5"} err="failed to get container status \"af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5\": rpc error: code = NotFound desc = could not find container \"af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5\": container with ID starting with af743ca8933412b78731a3fc61fffa099963286fb08f98ced533c4a414bc4fb5 not found: ID does not exist" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.432776 4789 scope.go:117] "RemoveContainer" containerID="e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91" Dec 16 09:35:26 crc kubenswrapper[4789]: E1216 09:35:26.433231 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91\": container with ID starting with e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91 not found: ID does not exist" containerID="e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91" Dec 16 09:35:26 crc 
kubenswrapper[4789]: I1216 09:35:26.433259 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91"} err="failed to get container status \"e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91\": rpc error: code = NotFound desc = could not find container \"e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91\": container with ID starting with e4a84fc7cee02553ed624368394de24ecd3777796421fa52319155ed9c7e7c91 not found: ID does not exist" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.433279 4789 scope.go:117] "RemoveContainer" containerID="d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2" Dec 16 09:35:26 crc kubenswrapper[4789]: E1216 09:35:26.433544 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2\": container with ID starting with d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2 not found: ID does not exist" containerID="d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2" Dec 16 09:35:26 crc kubenswrapper[4789]: I1216 09:35:26.433571 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2"} err="failed to get container status \"d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2\": rpc error: code = NotFound desc = could not find container \"d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2\": container with ID starting with d5424dc193b3221f0022d513ed37e55b692a7eb72f1d5c64351a369f132ae4c2 not found: ID does not exist" Dec 16 09:35:28 crc kubenswrapper[4789]: I1216 09:35:28.116781 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4aab0f9-3c8a-45c5-8448-29e214b43dde" 
path="/var/lib/kubelet/pods/d4aab0f9-3c8a-45c5-8448-29e214b43dde/volumes" Dec 16 09:35:29 crc kubenswrapper[4789]: I1216 09:35:29.089045 4789 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:29 crc kubenswrapper[4789]: I1216 09:35:29.140518 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgvkk"] Dec 16 09:35:29 crc kubenswrapper[4789]: I1216 09:35:29.349105 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgvkk" podUID="aff7102c-027c-40ae-bab0-e4ef11d9b11a" containerName="registry-server" containerID="cri-o://e05f6121ee0bc644e332ef971a9ab43f49c190e2ff2dba7a3791d004f2e7046f" gracePeriod=2 Dec 16 09:35:30 crc kubenswrapper[4789]: I1216 09:35:30.106603 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:35:30 crc kubenswrapper[4789]: E1216 09:35:30.107546 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:35:30 crc kubenswrapper[4789]: I1216 09:35:30.360384 4789 generic.go:334] "Generic (PLEG): container finished" podID="aff7102c-027c-40ae-bab0-e4ef11d9b11a" containerID="e05f6121ee0bc644e332ef971a9ab43f49c190e2ff2dba7a3791d004f2e7046f" exitCode=0 Dec 16 09:35:30 crc kubenswrapper[4789]: I1216 09:35:30.360438 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgvkk" 
event={"ID":"aff7102c-027c-40ae-bab0-e4ef11d9b11a","Type":"ContainerDied","Data":"e05f6121ee0bc644e332ef971a9ab43f49c190e2ff2dba7a3791d004f2e7046f"} Dec 16 09:35:30 crc kubenswrapper[4789]: I1216 09:35:30.978886 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.072839 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-catalog-content\") pod \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.072998 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cls9m\" (UniqueName: \"kubernetes.io/projected/aff7102c-027c-40ae-bab0-e4ef11d9b11a-kube-api-access-cls9m\") pod \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.073073 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-utilities\") pod \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\" (UID: \"aff7102c-027c-40ae-bab0-e4ef11d9b11a\") " Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.073787 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-utilities" (OuterVolumeSpecName: "utilities") pod "aff7102c-027c-40ae-bab0-e4ef11d9b11a" (UID: "aff7102c-027c-40ae-bab0-e4ef11d9b11a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.074184 4789 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.080700 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aff7102c-027c-40ae-bab0-e4ef11d9b11a-kube-api-access-cls9m" (OuterVolumeSpecName: "kube-api-access-cls9m") pod "aff7102c-027c-40ae-bab0-e4ef11d9b11a" (UID: "aff7102c-027c-40ae-bab0-e4ef11d9b11a"). InnerVolumeSpecName "kube-api-access-cls9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.135589 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aff7102c-027c-40ae-bab0-e4ef11d9b11a" (UID: "aff7102c-027c-40ae-bab0-e4ef11d9b11a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.176351 4789 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aff7102c-027c-40ae-bab0-e4ef11d9b11a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.176659 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cls9m\" (UniqueName: \"kubernetes.io/projected/aff7102c-027c-40ae-bab0-e4ef11d9b11a-kube-api-access-cls9m\") on node \"crc\" DevicePath \"\"" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.372620 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgvkk" event={"ID":"aff7102c-027c-40ae-bab0-e4ef11d9b11a","Type":"ContainerDied","Data":"a2669ca55633ed7c5e5a34e0861246e2147c8861f2fb0db55f3c70d7be2c434e"} Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.372680 4789 scope.go:117] "RemoveContainer" containerID="e05f6121ee0bc644e332ef971a9ab43f49c190e2ff2dba7a3791d004f2e7046f" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.372878 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgvkk" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.403160 4789 scope.go:117] "RemoveContainer" containerID="7479daa22188ed73a726ef498fb288b72b0b4797f1d44dd1fe6c1707df041d25" Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.414849 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgvkk"] Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.427164 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kgvkk"] Dec 16 09:35:31 crc kubenswrapper[4789]: I1216 09:35:31.979333 4789 scope.go:117] "RemoveContainer" containerID="8b68aadb3ca7402fd7c7453dc8bcc4b724175e15dee11faa0664e88a5c50887a" Dec 16 09:35:32 crc kubenswrapper[4789]: I1216 09:35:32.118674 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aff7102c-027c-40ae-bab0-e4ef11d9b11a" path="/var/lib/kubelet/pods/aff7102c-027c-40ae-bab0-e4ef11d9b11a/volumes" Dec 16 09:35:36 crc kubenswrapper[4789]: I1216 09:35:36.736135 4789 scope.go:117] "RemoveContainer" containerID="6beab4ff81f0645ca005dbccb3bf431ce3da9e7c99403472daf6f1738b84d683" Dec 16 09:35:41 crc kubenswrapper[4789]: I1216 09:35:41.104972 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:35:41 crc kubenswrapper[4789]: E1216 09:35:41.105760 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:35:53 crc kubenswrapper[4789]: I1216 09:35:53.105598 4789 scope.go:117] "RemoveContainer" 
containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:35:53 crc kubenswrapper[4789]: E1216 09:35:53.106399 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:36:07 crc kubenswrapper[4789]: I1216 09:36:07.105706 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:36:07 crc kubenswrapper[4789]: E1216 09:36:07.106616 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:36:21 crc kubenswrapper[4789]: I1216 09:36:21.106974 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:36:21 crc kubenswrapper[4789]: E1216 09:36:21.107941 4789 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pdg87_openshift-machine-config-operator(ca24a4b9-4b99-4de7-887d-f8804a4f06bb)\"" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" Dec 16 09:36:34 crc kubenswrapper[4789]: I1216 09:36:34.109945 4789 scope.go:117] 
"RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc" Dec 16 09:36:35 crc kubenswrapper[4789]: I1216 09:36:35.048868 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"a37d29240bd95970fe895937067f9cc3dd549ece0879f34ed4d844b92aab4919"} Dec 16 09:37:14 crc kubenswrapper[4789]: I1216 09:37:14.408354 4789 generic.go:334] "Generic (PLEG): container finished" podID="3f332980-1129-4e30-a022-8ab9019c060b" containerID="edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6" exitCode=0 Dec 16 09:37:14 crc kubenswrapper[4789]: I1216 09:37:14.408818 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" event={"ID":"3f332980-1129-4e30-a022-8ab9019c060b","Type":"ContainerDied","Data":"edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6"} Dec 16 09:37:14 crc kubenswrapper[4789]: I1216 09:37:14.409478 4789 scope.go:117] "RemoveContainer" containerID="edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6" Dec 16 09:37:14 crc kubenswrapper[4789]: I1216 09:37:14.743098 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vpf4l_must-gather-kz8nt_3f332980-1129-4e30-a022-8ab9019c060b/gather/0.log" Dec 16 09:37:22 crc kubenswrapper[4789]: I1216 09:37:22.998446 4789 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vpf4l/must-gather-kz8nt"] Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:22.999799 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" podUID="3f332980-1129-4e30-a022-8ab9019c060b" containerName="copy" containerID="cri-o://a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b" gracePeriod=2 Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 
09:37:23.011539 4789 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vpf4l/must-gather-kz8nt"] Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.475457 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vpf4l_must-gather-kz8nt_3f332980-1129-4e30-a022-8ab9019c060b/copy/0.log" Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.475809 4789 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.490622 4789 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vpf4l_must-gather-kz8nt_3f332980-1129-4e30-a022-8ab9019c060b/copy/0.log" Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.490997 4789 generic.go:334] "Generic (PLEG): container finished" podID="3f332980-1129-4e30-a022-8ab9019c060b" containerID="a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b" exitCode=143 Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.491044 4789 scope.go:117] "RemoveContainer" containerID="a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b" Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.491173 4789 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vpf4l/must-gather-kz8nt" Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.499512 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmszc\" (UniqueName: \"kubernetes.io/projected/3f332980-1129-4e30-a022-8ab9019c060b-kube-api-access-gmszc\") pod \"3f332980-1129-4e30-a022-8ab9019c060b\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.499625 4789 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f332980-1129-4e30-a022-8ab9019c060b-must-gather-output\") pod \"3f332980-1129-4e30-a022-8ab9019c060b\" (UID: \"3f332980-1129-4e30-a022-8ab9019c060b\") " Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.514766 4789 scope.go:117] "RemoveContainer" containerID="edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6" Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.515263 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f332980-1129-4e30-a022-8ab9019c060b-kube-api-access-gmszc" (OuterVolumeSpecName: "kube-api-access-gmszc") pod "3f332980-1129-4e30-a022-8ab9019c060b" (UID: "3f332980-1129-4e30-a022-8ab9019c060b"). InnerVolumeSpecName "kube-api-access-gmszc". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.594794 4789 scope.go:117] "RemoveContainer" containerID="a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b"
Dec 16 09:37:23 crc kubenswrapper[4789]: E1216 09:37:23.596002 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b\": container with ID starting with a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b not found: ID does not exist" containerID="a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b"
Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.596074 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b"} err="failed to get container status \"a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b\": rpc error: code = NotFound desc = could not find container \"a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b\": container with ID starting with a9ebb716a9a4151ab85c93276b1a3d55b248cdcee85a369e8cc4ab795e85ca7b not found: ID does not exist"
Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.596101 4789 scope.go:117] "RemoveContainer" containerID="edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6"
Dec 16 09:37:23 crc kubenswrapper[4789]: E1216 09:37:23.596458 4789 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6\": container with ID starting with edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6 not found: ID does not exist" containerID="edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6"
Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.596525 4789 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6"} err="failed to get container status \"edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6\": rpc error: code = NotFound desc = could not find container \"edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6\": container with ID starting with edb8cada30143e31f211cb798db6fc6df102ab8de10331c31b6d6d41515de8a6 not found: ID does not exist"
Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.614854 4789 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmszc\" (UniqueName: \"kubernetes.io/projected/3f332980-1129-4e30-a022-8ab9019c060b-kube-api-access-gmszc\") on node \"crc\" DevicePath \"\""
Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.708064 4789 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f332980-1129-4e30-a022-8ab9019c060b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3f332980-1129-4e30-a022-8ab9019c060b" (UID: "3f332980-1129-4e30-a022-8ab9019c060b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 09:37:23 crc kubenswrapper[4789]: I1216 09:37:23.717535 4789 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f332980-1129-4e30-a022-8ab9019c060b-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 16 09:37:23 crc kubenswrapper[4789]: E1216 09:37:23.921785 4789 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f332980_1129_4e30_a022_8ab9019c060b.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 09:37:24 crc kubenswrapper[4789]: I1216 09:37:24.116424 4789 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f332980-1129-4e30-a022-8ab9019c060b" path="/var/lib/kubelet/pods/3f332980-1129-4e30-a022-8ab9019c060b/volumes"
Dec 16 09:38:51 crc kubenswrapper[4789]: I1216 09:38:51.928144 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 09:38:51 crc kubenswrapper[4789]: I1216 09:38:51.928761 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 09:39:21 crc kubenswrapper[4789]: I1216 09:39:21.928355 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 09:39:21 crc kubenswrapper[4789]: I1216 09:39:21.929031 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 09:39:51 crc kubenswrapper[4789]: I1216 09:39:51.929064 4789 patch_prober.go:28] interesting pod/machine-config-daemon-pdg87 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 09:39:51 crc kubenswrapper[4789]: I1216 09:39:51.929976 4789 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 09:39:51 crc kubenswrapper[4789]: I1216 09:39:51.930209 4789 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pdg87"
Dec 16 09:39:51 crc kubenswrapper[4789]: I1216 09:39:51.931243 4789 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a37d29240bd95970fe895937067f9cc3dd549ece0879f34ed4d844b92aab4919"} pod="openshift-machine-config-operator/machine-config-daemon-pdg87" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 09:39:51 crc kubenswrapper[4789]: I1216 09:39:51.931307 4789 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" podUID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerName="machine-config-daemon" containerID="cri-o://a37d29240bd95970fe895937067f9cc3dd549ece0879f34ed4d844b92aab4919" gracePeriod=600
Dec 16 09:39:52 crc kubenswrapper[4789]: I1216 09:39:52.873174 4789 generic.go:334] "Generic (PLEG): container finished" podID="ca24a4b9-4b99-4de7-887d-f8804a4f06bb" containerID="a37d29240bd95970fe895937067f9cc3dd549ece0879f34ed4d844b92aab4919" exitCode=0
Dec 16 09:39:52 crc kubenswrapper[4789]: I1216 09:39:52.873322 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerDied","Data":"a37d29240bd95970fe895937067f9cc3dd549ece0879f34ed4d844b92aab4919"}
Dec 16 09:39:52 crc kubenswrapper[4789]: I1216 09:39:52.873866 4789 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pdg87" event={"ID":"ca24a4b9-4b99-4de7-887d-f8804a4f06bb","Type":"ContainerStarted","Data":"2ad6e26deb9c24f761e4fbfdc161ee787ba10bf1d55778a010acef67bff75866"}
Dec 16 09:39:52 crc kubenswrapper[4789]: I1216 09:39:52.873892 4789 scope.go:117] "RemoveContainer" containerID="0d067fa6a9391ccf5ab8091adb67b0a49526e5a3c24c93b2d4d3eb90050e6afc"